Nonlinear conjugate gradient methods for unconstrained optimization

Bibliographic Details
Main Author: Andrei, Neculai
Corporate Author: ProQuest (Firm)
Format: Electronic eBook
Language: English
Published: Cham : Springer, 2020.
Series: Springer optimization and its applications ; v. 158.
Subjects: Conjugate gradient methods; Constrained optimization
Online Access: Connect to this title online (unlimited simultaneous users allowed; 325 uses per year)

MARC

LEADER 00000nam a2200000Ia 4500
001 b3826862
003 CStclU
005 20200807151042.5
006 m o d
007 cr |n|||||||||
008 200702s2020 sz ob 001 0 eng d
020 |a 9783030429508  |q (electronic bk.) 
020 |a 3030429504  |q (electronic bk.) 
020 |z 3030429490 
020 |z 9783030429492 
024 7 |a 10.1007/978-3-030-42950-8 
035 |a (NhCcYBP)ebc6236239 
040 |a NhCcYBP  |c NhCcYBP 
050 4 |a QA218  |b .A53 2020 
072 7 |a PBU  |2 bicssc 
072 7 |a MAT003000  |2 bisacsh 
072 7 |a PBU  |2 thema 
082 0 4 |a 512.9/4  |2 23 
100 1 |a Andrei, Neculai. 
245 1 0 |a Nonlinear conjugate gradient methods for unconstrained optimization /  |c Neculai Andrei. 
260 |a Cham :  |b Springer,  |c 2020. 
300 |a 1 online resource. 
336 |a text  |b txt  |2 rdacontent 
337 |a computer  |b c  |2 rdamedia 
338 |a online resource  |b cr  |2 rdacarrier 
347 |a text file  |b PDF  |2 rda 
490 1 |a Springer optimization and its applications,  |x 1931-6828 ;  |v v. 158 
504 |a Includes bibliographical references and indexes. 
505 0 0 |a Machine generated contents note:   |g 1.  |t Introduction: Overview of Unconstrained Optimization --   |g 1.1.  |t Problem --   |g 1.2.  |t Line Search --   |g 1.3.  |t Optimality Conditions for Unconstrained Optimization --   |g 1.4.  |t Overview of Unconstrained Optimization Methods --   |g 1.4.1.  |t Steepest Descent Method --   |g 1.4.2.  |t Newton Method --   |g 1.4.3.  |t Quasi-Newton Methods --   |g 1.4.4.  |t Modifications of the BFGS Method --   |g 1.4.5.  |t Quasi-Newton Methods with Diagonal Updating of the Hessian --   |g 1.4.6.  |t Limited-Memory Quasi-Newton Methods --   |g 1.4.7.  |t Truncated Newton Methods --   |g 1.4.8.  |t Conjugate Gradient Methods --   |g 1.4.9.  |t Trust-Region Methods --   |g 1.4.10.  |t p-Regularized Methods --   |g 1.5.  |t Test Problems and Applications --   |g 1.6.  |t Numerical Experiments --   |t Notes and References --   |g 2.  |t Linear Conjugate Gradient Algorithm --   |g 2.1.  |t Line Search --   |g 2.2.  |t Fundamental Property of the Line Search Method with Conjugate Directions --   |g 2.3.  |t Linear Conjugate Gradient Algorithm --   |g 2.4.  |t Convergence Rate of the Linear Conjugate Gradient Algorithm --   |g 2.5.  |t Comparison of the Convergence Rate of the Linear Conjugate Gradient and of the Steepest Descent --   |g 2.6.  |t Preconditioning of the Linear Conjugate Gradient Algorithms --   |t Notes and References --   |g 3.  |t General Convergence Results for Nonlinear Conjugate Gradient Methods --   |g 3.1.  |t Types of Convergence --   |g 3.2.  |t Concept of Nonlinear Conjugate Gradient --   |g 3.3.  |t General Convergence Results for Nonlinear Conjugate Gradient Methods --   |g 3.3.1.  |t Convergence Under the Strong Wolfe Line Search --   |g 3.3.2.  |t Convergence Under the Standard Wolfe Line Search --   |g 3.4.  |t Criticism of the Convergence Results --   |t Notes and References --   |g 4.  |t Standard Conjugate Gradient Methods --   |g 4.1.  |t Conjugate Gradient Methods with ‖gk+1‖2 in the Numerator of βk --   |g 4.2.  |t Conjugate Gradient Methods with gTk+1yk in the Numerator of βk --   |g 4.3.  |t Numerical Study --   |t Notes and References --   |g 5.  |t Acceleration of Conjugate Gradient Algorithms --   |g 5.1.  |t Standard Wolfe Line Search with Cubic Interpolation --   |g 5.2.  |t Acceleration of Nonlinear Conjugate Gradient Algorithms --   |g 5.3.  |t Numerical Study --   |t Notes and References --   |g 6.  |t Hybrid and Parameterized Conjugate Gradient Methods --   |g 6.1.  |t Hybrid Conjugate Gradient Methods Based on the Projection Concept --   |g 6.2.  |t Hybrid Conjugate Gradient Methods as Convex Combinations of the Standard Conjugate Gradient Methods --   |g 6.3.  |t Parameterized Conjugate Gradient Methods --   |t Notes and References --   |g 7.  |t Conjugate Gradient Methods as Modifications of the Standard Schemes --   |g 7.1.  |t Conjugate Gradient with Dai and Liao Conjugacy Condition (DL) --   |g 7.2.  |t Conjugate Gradient with Guaranteed Descent (CG-DESCENT) --   |g 7.3.  |t Conjugate Gradient with Guaranteed Descent and Conjugacy Conditions and a Modified Wolfe Line Search (DESCON) --   |t Notes and References --   |g 8.  |t Conjugate Gradient Methods Memoryless BFGS Preconditioned --   |g 8.1.  |t Conjugate Gradient Memoryless BFGS Preconditioned (CONMIN) --   |g 8.2.  |t Scaling Conjugate Gradient Memoryless BFGS Preconditioned (SCALCG) --   |g 8.3.  |t Conjugate Gradient Method Closest to Scaled Memoryless BFGS Search Direction (DK/CGOPT) --   |g 8.4.  
|t New Conjugate Gradient Algorithms Based on Self-Scaling Memoryless BFGS Updating --   |t Notes and References --   |g 9.  |t Three-Term Conjugate Gradient Methods --   |g 9.1.  |t Three-Term Conjugate Gradient Method with Descent and Conjugacy Conditions (TTCG) --   |g 9.2.  |t Three-Term Conjugate Gradient Method with Subspace Minimization (TTS) --   |g 9.3.  |t Three-Term Conjugate Gradient Method with Minimization of One-Parameter Quadratic Model of Minimizing Function (TTDES) --   |t Notes and References --   |g 10.  |t Preconditioning of the Nonlinear Conjugate Gradient Algorithms --   |g 10.1.  |t Preconditioners Based on Diagonal Approximations to the Hessian --   |g 10.2.  |t Criticism of Preconditioning the Nonlinear Conjugate Gradient Algorithms --   |t Notes and References --   |g 11.  |t Other Conjugate Gradient Methods --   |g 11.1.  |t Eigenvalues Versus Singular Values in Conjugate Gradient Algorithms (CECG and SVCG) --   |g 11.2.  |t Conjugate Gradient Algorithm with Guaranteed Descent and Conjugacy Conditions (CGSYS) --   |g 11.3.  |t Combination of Conjugate Gradient with Limited-Memory BFGS Methods --   |g 11.4.  |t Conjugate Gradient with Subspace Minimization Based on Regularization Model of the Minimizing Function --   |t Notes and References --   |g 12.  |t Discussions, Conclusions, and Large-Scale Optimization --   |t Notes and References. 
533 |a Electronic reproduction.  |b Ann Arbor, MI  |n Available via World Wide Web. 
650 0 |a Conjugate gradient methods. 
650 0 |a Constrained optimization. 
710 2 |a ProQuest (Firm) 
776 0 8 |c Original  |z 3030429490  |z 9783030429492 
830 0 |a Springer optimization and its applications ;  |v v. 158. 
856 4 0 |u https://ebookcentral.proquest.com/lib/santaclara/detail.action?docID=6236239  |z Connect to this title online (unlimited simultaneous users allowed; 325 uses per year) 
907 |a .b38268620  |b 211018  |c 211018 
998 |a uww  |b    |c m  |d z   |e l  |f eng  |g sz   |h 0 
917 |a GOBI ProQuest DDA 
919 |a .ulebk  |b 2020-07-09 
999 f f |i 933cded9-6dff-5010-aa01-92be8ffc4375  |s 71b7f6b9-9e20-55b2-abd2-e073c57d0c6a  |t 0