Fine-tuning LLMs with LoRA and Hessian-free optimizers (Python; updated Apr 15, 2024)
Stochastic Second-Order Methods in JAX
On a new method of Hessian-free second-order optimization
Implementation of Adaptive Hessian-free optimization.
Implementation of numerical optimization algorithms for the logistic regression problem
TensorFlow implementation of meta-learning with a Hessian-free approach for training deep neural networks
Matrix-multiplication-only KFAC: code for the ICML 2023 paper on simplifying momentum-based positive-definite submanifold optimization, with applications to deep learning
Course project for CS771A: Introduction to Machine Learning
PyTorch implementation of the Hessian-free optimizer
PyTorch implementation of Hessian-free optimisation
ADAHESSIAN: An Adaptive Second Order Optimizer for Machine Learning
PyHessian is a PyTorch library for second-order analysis and training of neural networks
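The common primitive behind most of these projects is the Hessian-vector product: second-order information is used without ever materializing the Hessian matrix. A minimal sketch in plain Python, assuming a simple quadratic objective (the matrix `A` and the helper names `grad`, `hvp`, and `hutchinson_diag` are illustrative, not taken from any repository above):

```python
import random

# For f(x) = 0.5 * x^T A x the gradient is A x, and a Hessian-vector
# product H v can be approximated by a finite difference of gradients:
#   (grad(x + eps*v) - grad(x)) / eps  ~=  H v
# so the full Hessian is never formed explicitly.

A = [[3.0, 1.0],
     [1.0, 2.0]]  # symmetric positive-definite matrix (illustrative)

def grad(x):
    """Gradient of f(x) = 0.5 * x^T A x, i.e. A x."""
    return [sum(A[i][j] * x[j] for j in range(2)) for i in range(2)]

def hvp(x, v, eps=1e-5):
    """Finite-difference Hessian-vector product: (g(x + eps*v) - g(x)) / eps."""
    g0 = grad(x)
    g1 = grad([x[i] + eps * v[i] for i in range(2)])
    return [(g1[i] - g0[i]) / eps for i in range(2)]

def hutchinson_diag(x, n_samples=200):
    """Estimate diag(H) as E[z * (H z)] with Rademacher z,
    the AdaHessian-style diagonal approximation."""
    est = [0.0, 0.0]
    for _ in range(n_samples):
        z = [random.choice([-1.0, 1.0]) for _ in range(2)]
        hz = hvp(x, z)
        for i in range(2):
            est[i] += z[i] * hz[i] / n_samples
    return est

print(hvp([1.0, -1.0], [0.5, 2.0]))  # close to A @ v = [3.5, 4.5]
```

In autodiff frameworks the finite difference is replaced by an exact Pearlmutter-style product (e.g. a forward-over-reverse pass), but the structure is the same: only matrix-vector products with the Hessian are ever needed, which is what makes conjugate-gradient inner solves and diagonal estimators tractable at neural-network scale.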