

Download: Modern Numerical Nonlinear Optimization

Book Details

Modern Numerical Nonlinear Optimization

Category: Optimization, Operations Research
Edition:
Authors:
Series: Springer Optimization and Its Applications, 195
ISBN: 3031087194, 9783031087196
Publisher: Springer
Year: 2022
Pages: 824
Language: English
File format: PDF (EPUB or AZW3 conversion available on request)
File size: 22 MB

Price (Toman): 52,000




If you would like the book Modern Numerical Nonlinear Optimization converted to PDF, EPUB, AZW3, MOBI, or DJVU, contact support and the file will be converted for you.

Note that Modern Numerical Nonlinear Optimization is the original-language edition, not a Persian translation. The International Library website offers original-language books only and does not provide any books translated into or written in Persian.


Book description (in the original language)



Table of Contents

Preface
Contents
List of Algorithms
List of Applications
List of Figures
List of Tables
1: Introduction
	1.1 Mathematical Modeling: Linguistic Models Versus Mathematical Models
	1.2 Mathematical Modeling and Computational Sciences
	1.3 The Modern Modeling Scheme for Optimization
	1.4 Classification of Optimization Problems
	1.5 Optimization Algorithms
	1.6 Collections of Applications for Numerical Experiments
	1.7 Comparison of Algorithms
	1.8 The Structure of the Book
2: Fundamentals on Unconstrained Optimization. Stepsize Computation
	2.1 The Problem
	2.2 Fundamentals on the Convergence of the Line-Search Methods
	2.3 The General Algorithm for Unconstrained Optimization
	2.4 Convergence of the Algorithm with Exact Line-Search
	2.5 Inexact Line-Search Methods
	2.6 Convergence of the Algorithm with Inexact Line-Search
	2.7 Three Fortran Implementations of the Inexact Line-Search
	2.8 Numerical Studies: Stepsize Computation
3: Steepest Descent Methods
	3.1 The Steepest Descent
		Convergence of the Steepest Descent Method for Quadratic Functions
		Inequality of Kantorovich
		Numerical Study
		Convergence of the Steepest Descent Method for General Functions
	3.2 The Relaxed Steepest Descent
		Numerical Study: SDB Versus RSDB
	3.3 The Accelerated Steepest Descent
		Numerical Study
	3.4 Comments on the Acceleration Scheme
4: The Newton Method
	4.1 The Newton Method for Solving Nonlinear Algebraic Systems
	4.2 The Gauss-Newton Method
	4.3 The Newton Method for Function Minimization
	4.4 The Newton Method with Line-Search
	4.5 Analysis of Complexity
	4.6 The Modified Newton Method
	4.7 The Newton Method with Finite-Differences
	4.8 Errors in Functions, Gradients, and Hessians
	4.9 Negative Curvature Direction Methods
	4.10 The Composite Newton Method
5: Conjugate Gradient Methods
	5.1 The Concept of Nonlinear Conjugate Gradient
	5.2 The Linear Conjugate Gradient Method
		The Linear Conjugate Gradient Algorithm
		Convergence Rate of the Linear Conjugate Gradient Algorithm
		Preconditioning
		Incomplete Cholesky Factorization
		Comparison of the Convergence Rate of the Linear Conjugate Gradient and of the Steepest Descent
	5.3 General Convergence Results for Nonlinear Conjugate Gradient Methods
		Convergence Under the Strong Wolfe Line-Search
		Convergence Under the Wolfe Line-Search
	5.4 Standard Conjugate Gradient Methods
		Conjugate Gradient Methods with ‖gk+1‖² in the Numerator of βk
			The Fletcher-Reeves Method
			The CD Method
			The Dai-Yuan Method
		Conjugate Gradient Methods with gk+1ᵀyk in the Numerator of βk
			The Polak-Ribière-Polyak Method
			The Hestenes-Stiefel Method
			The Liu-Storey Method
		Numerical Study: Standard Conjugate Gradient Methods
	5.5 Hybrid Conjugate Gradient Methods
		Hybrid Conjugate Gradient Methods Based on the Projection Concept
		Numerical Study: Hybrid Conjugate Gradient Methods
		Hybrid Conjugate Gradient Methods as Convex Combinations of the Standard Conjugate Gradient Methods
		The Hybrid Convex Combination of LS and DY
		Numerical Study: NDLSDY
	5.6 Conjugate Gradient Methods as Modifications of the Standard Schemes
		The Dai-Liao Conjugate Gradient Method
		The Conjugate Gradient with Guaranteed Descent (CG-DESCENT)
		Numerical Study: CG-DESCENT
		The Conjugate Gradient with Guaranteed Descent and Conjugacy Conditions and a Modified Wolfe Line-Search (DESCON)
		Numerical Study: DESCON
	5.7 Conjugate Gradient Methods Memoryless BFGS Preconditioned
		The Memoryless BFGS Preconditioned Conjugate Gradient (CONMIN)
		Numerical Study: CONMIN
		The Conjugate Gradient Method Closest to the Scaled Memoryless BFGS Search Direction (DK / CGOPT)
		Numerical Study: DK/CGOPT
	5.8 Solving Large-Scale Applications
6: Quasi-Newton Methods
	6.1 DFP and BFGS Methods
	6.2 Modifications of the BFGS Method
	6.3 Quasi-Newton Methods with Diagonal Updating of the Hessian
	6.4 Limited-Memory Quasi-Newton Methods
	6.5 The SR1 Method
	6.6 Sparse Quasi-Newton Updates
	6.7 Quasi-Newton Methods and Separable Functions
	6.8 Solving Large-Scale Applications
7: Inexact Newton Methods
	7.1 The Inexact Newton Method for Nonlinear Algebraic Systems
	7.2 Inexact Newton Methods for Function Minimization
	7.3 The Line-Search Newton-CG Method
	7.4 Comparison of TN Versus Conjugate Gradient Algorithms
	7.5 Comparison of TN Versus L-BFGS
	7.6 Solving Large-Scale Applications
8: The Trust-Region Method
	8.1 The Trust-Region
	8.2 Algorithms Based on the Cauchy Point
	8.3 The Trust-Region Newton-CG Method
	8.4 The Global Convergence
	8.5 Iterative Solution of the Subproblem
	8.6 The Scaled Trust-Region
9: Direct Methods for Unconstrained Optimization
	9.1 The NELMED Algorithm
	9.2 The NEWUOA Algorithm
	9.3 The DEEPS Algorithm
	9.4 Numerical Study: NELMED, NEWUOA, and DEEPS
10: Constrained Nonlinear Optimization Methods: An Overview
	10.1 Convergence Tests
	10.2 Infeasible Points
	10.3 Approximate Subproblem: Local Models and Their Solving
	10.4 Globalization Strategy: Convergence from Remote Starting Points
	10.5 Refining the Local Model
11: Optimality Conditions for Nonlinear Optimization
	11.1 General Concepts in Nonlinear Optimization
	11.2 Optimality Conditions for Unconstrained Optimization
	11.3 Optimality Conditions for Problems with Inequality Constraints
	11.4 Optimality Conditions for Problems with Equality Constraints
	11.5 Optimality Conditions for General Nonlinear Optimization Problems
	11.6 Duality
12: Simple Bound Constrained Optimization
	12.1 Necessary Conditions for Optimality
	12.2 Sufficient Conditions for Optimality
	12.3 Methods for Solving Simple Bound Optimization Problems
	12.4 The Spectral Projected Gradient Method (SPG)
		Numerical Study-SPG: Quadratic Interpolation Versus Cubic Interpolation
	12.5 L-BFGS with Simple Bounds (L-BFGS-B)
		Numerical Study: L-BFGS-B Versus SPG
	12.6 Truncated Newton with Simple Bounds (TNBC)
	12.7 Applications
		Application A1 (Elastic-Plastic Torsion)
		Application A2 (Pressure Distribution in a Journal Bearing)
		Application A3 (Optimal Design with Composite Materials)
		Application A4 (Steady-State Combustion)
		Application A6 (Inhomogeneous Superconductors: 1-D Ginzburg-Landau)
13: Quadratic Programming
	13.1 Equality Constrained Quadratic Programming
		Factorization of the Full KKT System
		The Schur-Complement Method
		The Null-Space Method
		Large-Scale Problems
		The Conjugate Gradient Applied to the Reduced System
		The Projected Conjugate Gradient Method
	13.2 Inequality Constrained Quadratic Programming
		The Primal Active-Set Method
		An Algorithm for Positive Definite Hessian
		Reduced Gradient for Inequality Constraints
		The Reduced Gradient for Simple Bounds
		The Primal-Dual Active-Set Method
	13.3 Interior Point Methods
		Stepsize Selection
	13.4 Methods for Convex QP Problems with Equality Constraints
	13.5 Quadratic Programming with Simple Bounds: The Gradient Projection Method
		The Cauchy Point
		Subproblem Minimization
	13.6 Elimination of Variables
14: Penalty and Augmented Lagrangian Methods
	14.1 The Quadratic Penalty Method
	14.2 The Nonsmooth Penalty Method
	14.3 The Augmented Lagrangian Method
	14.4 Criticism of the Penalty and Augmented Lagrangian Methods
	14.5 A Penalty-Barrier Algorithm (SPENBAR)
		The Penalty-Barrier Method
		Global Convergence
		Numerical Study-SPENBAR: Solving Applications from the LACOP Collection
	14.6 The Linearly Constrained Augmented Lagrangian (MINOS)
		MINOS for Linear Constraints
		Numerical Study: MINOS for Linear Programming
		MINOS for Nonlinear Constraints
		Numerical Study-MINOS: Solving Applications from the LACOP Collection
15: Sequential Quadratic Programming
	15.1 A Simple Approach to SQP
	15.2 Reduced-Hessian Quasi-Newton Approximations
	15.3 Merit Functions
	15.4 Second-Order Correction (Maratos Effect)
	15.5 The Line-Search SQP Algorithm
	15.6 The Trust-Region SQP Algorithm
	15.7 Sequential Linear-Quadratic Programming (SLQP)
	15.8 An SQP Algorithm for Large-Scale Constrained Optimization (SNOPT)
	15.9 An SQP Algorithm with Successive Error Restoration (NLPQLP)
	15.10 Active-Set Sequential Linear-Quadratic Programming (KNITRO/ACTIVE)
16: Primal Methods: The Generalized Reduced Gradient with Sequential Linearization
	16.1 Feasible Direction Methods
	16.2 Active Set Methods
	16.3 The Gradient Projection Method
	16.4 The Reduced Gradient Method
	16.5 The Convex Simplex Method
	16.6 The Generalized Reduced Gradient Method (GRG)
	16.7 GRG with Sequential Linear or Sequential Quadratic Programming (CONOPT)
17: Interior-Point Methods
	17.1 Prototype of the Interior-Point Algorithm
	17.2 Aspects of the Algorithmic Developments
	17.3 Line-Search Interior-Point Algorithm
	17.4 A Variant of the Line-Search Interior-Point Algorithm
	17.5 Trust-Region Interior-Point Algorithm
	17.6 Interior-Point Sequential Linear-Quadratic Programming (KNITRO/INTERIOR)
18: Filter Methods
	18.1 Sequential Linear Programming Filter Algorithm
	18.2 Sequential Quadratic Programming Filter Algorithm
19: Interior-Point Filter Line-Search
	19.1 Basic Algorithm IPOPT
		The Primal-Dual Barrier Approach
		Solving the Barrier Problem
		Line-Search Filter Method
		Second-Order Corrections
		The Algorithm
	19.2 Implementation Details
		General Lower and Upper Bounds
		Initialization
		Handling Unbounded Solution Sets
		Inertia Correction
		Automatic Scaling of the Problem
		Feasibility Restoration Phase
		Numerical Study-IPOPT: Solving Applications from the LACOP Collection
20: Direct Methods for Constrained Optimization
	20.1 COBYLA Algorithm
		Numerical Study-COBYLA Algorithm
	20.2 DFL Algorithm
		Numerical Study-DFL Algorithm
Appendix A: Mathematical Review
	A1. Elements of Applied Numerical Linear Algebra
		Vectors
		Norms of Vectors
		Matrices
		Matrix Norms
		Subspaces
		Inverse of a Matrix
		Orthogonality
		Eigenvalues
		Positive Definite Matrices
		Gaussian Elimination (LU Factorization)
			Gaussian Elimination with Partial Pivoting
			Gaussian Elimination with Complete Pivoting
		Cholesky Factorization
		Modified Cholesky Factorization
		QR Decomposition
		Singular Value Decomposition
		Spectral Decomposition (Symmetric Eigenvalue Decomposition)
		Elementary Matrices
		Conditioning and Stability
		Determinant of a Matrix
		Trace of a Matrix
	A2. Elements of Analysis
		Rates of Convergence
		Finite-Difference Derivative Estimates
		Automatic Differentiation
		Order Notation
	A3. Elements of Topology in the Euclidean Space ℝⁿ
	A4. Elements of Convexity: Convex Sets and Convex Functions
		Convex Sets
		Convex Functions
		Strong Convexity
Appendix B: The SMUNO Collection
	Small-Scale Continuous Unconstrained Optimization Applications
Appendix C: The LACOP Collection
	Large-Scale Continuous Nonlinear Optimization Applications
Appendix D: The MINPACK-2 Collection
	Large-Scale Unconstrained Optimization Applications
References
Author Index
Subject Index



