Optimization Theory Course Major Assignment: Implementation of Fundamental Optimization Algorithms

Resource Overview

This source code package is my major assignment for the Optimization Theory course. It contains self-implemented versions of four common optimization algorithms: the Steepest Descent Method, Newton's Method, the Nonlinear Least Squares Method, and the DFP (Davidon-Fletcher-Powell) Method. Two test functions, fun1 and fun2, are included to validate algorithm performance and convergence behavior on different optimization landscapes.
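As a minimal sketch of the gradient-based iteration the package is built around, the following Python snippet implements steepest descent with an Armijo backtracking line search. This is an illustration, not the assignment's actual code: the quadratic objective, starting point, and tolerance below are hypothetical stand-ins for fun1/fun2, whose definitions are not reproduced here.

```python
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-8, max_iter=1000):
    """Minimize f by moving along the negative gradient with backtracking."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # termination criterion on gradient norm
            break
        d = -g                        # steepest-descent direction
        t = 1.0
        # Armijo backtracking: shrink the step until sufficient decrease holds
        while f(x + t * d) > f(x) - 1e-4 * t * (g @ g):
            t *= 0.5
        x = x + t * d
    return x

# Hypothetical test objective: f(x) = x1^2 + 10*x2^2, minimum at the origin
f = lambda x: x[0]**2 + 10 * x[1]**2
grad = lambda x: np.array([2 * x[0], 20 * x[1]])
x_star = steepest_descent(f, grad, [3.0, -2.0])
```

The ill-conditioned quadratic makes the method's characteristic zig-zag behavior visible: convergence is linear, with the rate governed by the condition number of the Hessian.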

Detailed Documentation

The package, developed during this semester's Optimization Theory course, implements four fundamental methods:

- Steepest Descent: iterates along the negative gradient direction, improving the objective at each step.
- Newton's Method: uses second-order derivative (Hessian) information for faster, locally quadratic convergence.
- Nonlinear Least Squares: fits models whose parameters enter nonlinearly by minimizing a sum of squared residuals.
- DFP Method: a quasi-Newton approach that builds an approximation to the inverse Hessian from gradient differences.

Each routine is modularized with clear input/output parameters and explicit termination criteria. The two test functions, fun1 and fun2, demonstrate algorithmic effectiveness on different optimization landscapes, with configurable initial points and convergence tolerances. Thank you for your time and consideration!
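To make the quasi-Newton idea concrete, here is a minimal Python sketch of the DFP method with its rank-two update of the inverse-Hessian approximation. It is an illustration under stated assumptions, not the assignment's actual implementation: the quadratic objective, starting point, and tolerances are hypothetical stand-ins for fun1/fun2.

```python
import numpy as np

def dfp(f, grad, x0, tol=1e-8, max_iter=500):
    """DFP quasi-Newton method with an Armijo backtracking line search."""
    x = np.asarray(x0, dtype=float)
    H = np.eye(x.size)            # initial inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:    # termination criterion
            break
        d = -H @ g                     # quasi-Newton search direction
        t = 1.0
        # Armijo backtracking: shrink the step until sufficient decrease holds
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if s @ y > 1e-12:              # curvature condition keeps H positive definite
            Hy = H @ y
            # DFP rank-two update: H += s s^T / (s^T y) - H y y^T H / (y^T H y)
            H = H + np.outer(s, s) / (s @ y) - np.outer(Hy, Hy) / (y @ Hy)
        x, g = x_new, g_new
    return x

# Hypothetical test objective: minimum at (1, -2)
f = lambda x: (x[0] - 1)**2 + 5 * (x[1] + 2)**2
grad = lambda x: np.array([2 * (x[0] - 1), 10 * (x[1] + 2)])
x_star = dfp(f, grad, [0.0, 0.0])
```

The curvature check before the update is one common safeguard: skipping the update when s^T y is not safely positive preserves positive definiteness of H, so the search direction remains a descent direction.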