This report studies numerical optimization methods for logistic regression, emphasizing how curvature and conditioning determine algorithmic convergence.
In this project, we do the following:
- Derive gradient and Hessian expressions
- Establish convexity, and strong convexity under L2 regularization
- Compare fixed-step gradient descent, line-search variants, and Newton-type methods through controlled computational experiments
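As a minimal sketch of the first two items, the snippet below computes the L2-regularized logistic loss together with its gradient and Hessian; the function and variable names (`logistic_loss_grad_hess`, `lam` for the regularization strength, labels `y` in {0, 1}) are illustrative assumptions, not taken from the report itself.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_loss_grad_hess(w, X, y, lam):
    """L2-regularized logistic loss, gradient, and Hessian.

    X: (n, d) design matrix, y: labels in {0, 1}, lam: L2 strength.
    (Names and conventions are assumptions for this sketch.)
    """
    n, d = X.shape
    p = sigmoid(X @ w)                       # predicted probabilities
    loss = (-np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))
            + 0.5 * lam * w @ w)
    grad = X.T @ (p - y) / n + lam * w
    s = p * (1 - p)                          # per-sample curvature weights
    hess = (X.T * s) @ X / n + lam * np.eye(d)
    return loss, grad, hess
```

With `lam > 0`, the Hessian's smallest eigenvalue is bounded below by `lam`, which is exactly the strong-convexity property the report establishes; a finite-difference check against `grad` is a quick way to validate the derivation numerically.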