Course | Postgraduate
Semester | Electives
Subject Code | MA872
Subject Title | Advanced Optimization
Unconstrained Optimization: line search methods: Wolfe conditions, Goldstein conditions, sufficient decrease and backtracking, Newton's method and quasi-Newton methods; trust-region methods: the Cauchy point, an algorithm based on the Cauchy point, improving on the Cauchy point, the dogleg method, two-dimensional subspace minimization; nonlinear conjugate gradient methods: the Fletcher-Reeves method.
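To make the line-search topics above concrete, here is a minimal sketch (not part of the official syllabus) of gradient descent with a backtracking line search enforcing the Armijo sufficient-decrease condition, in the spirit of Nocedal and Wright. The toy objective, the constants rho and c, and all function names are illustrative assumptions, not prescribed by the course.

```python
import numpy as np

def backtracking_line_search(f, grad_f, x, p, alpha0=1.0, rho=0.5, c=1e-4):
    """Shrink the step length alpha until the Armijo sufficient-decrease
    condition f(x + alpha*p) <= f(x) + c*alpha*grad_f(x)@p holds."""
    alpha = alpha0
    fx, gx = f(x), grad_f(x)
    while f(x + alpha * p) > fx + c * alpha * gx @ p:
        alpha *= rho
    return alpha

# Toy problem (assumed for illustration): minimize the convex quadratic
# f(x) = 0.5 x^T A x - b^T x, whose exact minimizer is A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad_f = lambda x: A @ x - b

x = np.zeros(2)
for _ in range(50):
    p = -grad_f(x)                # steepest-descent direction
    if np.linalg.norm(p) < 1e-8:  # stop once the gradient is small
        break
    x = x + backtracking_line_search(f, grad_f, x, p) * p

print("minimizer ~", x)           # approaches A^{-1} b = (0.2, 0.4)
```

Adding a curvature requirement on top of sufficient decrease yields the Wolfe conditions covered in the syllabus; backtracking alone enforces only the sufficient-decrease part.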
Constrained Optimization: penalty methods, the quadratic penalty method, convergence, nonsmooth penalty functions, the L1 penalty method, the augmented Lagrangian method; quadratic programming, the Schur-complement method, the null-space method, the active-set method for convex QP; sequential quadratic programming, convex programming.
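Likewise, a minimal sketch of the quadratic penalty method listed above: the equality-constrained problem min f(x) s.t. c(x) = 0 is replaced by a sequence of unconstrained subproblems Q(x; mu) = f(x) + (mu/2) c(x)^2 with increasing mu. The toy problem, the BFGS inner solver, the starting point, and all parameter values are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Toy problem (assumed): minimize f(x) = x1 + x2 subject to
# c(x) = x1^2 + x2^2 - 2 = 0.  The constrained minimizer is (-1, -1).
f = lambda x: x[0] + x[1]
c = lambda x: x[0]**2 + x[1]**2 - 2.0

def quadratic_penalty(x0, mu=1.0, growth=10.0, outer_iters=8):
    """Minimize Q(x; mu) = f(x) + (mu/2) c(x)^2 for an increasing
    sequence of penalty parameters mu, warm-starting each solve
    from the previous subproblem's solution."""
    x = x0
    for _ in range(outer_iters):
        Q = lambda x, mu=mu: f(x) + 0.5 * mu * c(x)**2
        x = minimize(Q, x, method="BFGS").x   # unconstrained subproblem
        mu *= growth                          # tighten the penalty
    return x

# The method is local, so the starting point selects the basin.
print(quadratic_penalty(np.array([-0.5, -0.5])))  # approaches (-1, -1)
```

Warm-starting each subproblem matters because the subproblems become increasingly ill-conditioned as mu grows; this convergence behavior, and the augmented Lagrangian remedy, are both topics in the syllabus.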
Same as References
Boyd, S. and Vandenberghe, L., Convex Optimization, Cambridge University Press (2004).
Nocedal, J. and Wright, S. J., Numerical Optimization, Springer (2006).
CO1: Impart knowledge of the advanced theory of optimization.
CO2: Familiarize students with advanced algorithms for solving optimization problems.
CO3: Write code to solve optimization problems using advanced algorithms.