I - Four labs:
- Proximal algorithms for non-smooth optimization (PGD, accelerated PGD)
- Stochastic optimization (SGD, SGDA, SVRG, SAG)
- Coordinate-based algorithms (proximal CD)
- Quasi-Newton methods (DFP, BFGS, L-BFGS)
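To illustrate the first lab's topic, here is a minimal sketch of proximal gradient descent (PGD) on a lasso-type problem; the objective, step size, and function names are illustrative and not taken from the lab handouts.

```python
import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def proximal_gradient(A, b, lam, step, n_iter=500):
    # Minimize 0.5 * ||Ax - b||^2 + lam * ||x||_1 with PGD:
    # a gradient step on the smooth term, then the prox of the non-smooth term.
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                      # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)  # prox step
    return x
```

The accelerated variant adds a Nesterov-style momentum extrapolation between the prox steps.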
II - Final Project: mathematical derivations and implementations of optimization algorithms for the quantile regression problem with a smoothed pinball loss and (non-)smooth penalties. Algorithms include:
- Proximal Methods
- Coordinate methods
- Regular gradient descent
- L-BFGS (quasi-Newton)
- L1 and L2 proximal operators
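The project's exact smoothing of the pinball loss is not specified here; one common choice is a Huber-type (Moreau-envelope) smoothing, sketched below with an assumed smoothing parameter `mu`.

```python
import numpy as np

def smoothed_pinball(u, tau, mu):
    # Moreau-envelope smoothing of the pinball loss
    # rho_tau(u) = u * (tau - 1[u < 0]): the kink at 0 is replaced by a
    # quadratic on the band -mu*(1-tau) < u < mu*tau. This is one common
    # smoothing; the project's exact choice may differ.
    u = np.asarray(u, dtype=float)
    out = np.empty_like(u)
    hi = u > mu * tau                 # linear branch, slope tau
    lo = u < -mu * (1.0 - tau)        # linear branch, slope tau - 1
    mid = ~(hi | lo)                  # quadratic branch around the kink
    out[hi] = tau * u[hi] - mu * tau**2 / 2.0
    out[lo] = (tau - 1.0) * u[lo] - mu * (1.0 - tau)**2 / 2.0
    out[mid] = u[mid]**2 / (2.0 * mu)
    return out
```

As `mu -> 0` this recovers the exact pinball loss, and its gradient is Lipschitz with constant `1/mu`, which is what makes gradient descent and L-BFGS applicable.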