LaplaceRedux v0.1.3
Closed issues:
- Add support for block-diagonal Hessian approximations (#12)
- Implement Generalized Gauss-Newton (GGN) approximation (#17)
- Interface with MLJ (#18)
- Add support for mini-batch training (#19)
- Refactor gradients and Jacobians as multi-dimensional arrays (#20)
- Update Documentation (#22)
- 🏃🏽 Getting started (#24)
- 🚀 Lift off (#25)
- 🌯 Wrapping Up (#26)
- Add support for Last-Layer and Subnet Laplace (#27)
- Organise code base (#36)
- Move FormatCheck into separate file (#40)
Merged pull requests:
- Interface with MLJ (#18) (#21) (@pat-alt)
- Clean README, docs and docstrings (#23) (@pitmonticone)
- Complete CSE2000 contributions (#33) (@severinbratus)
- Improve docstrings (#35) (@severinbratus)
- Move FormatCheck into separate file (#41) (@pat-alt)
- Sort out compat issues (#56) (@pat-alt)
- CompatHelper: bump compat for Flux to 0.14 (keep existing compat) (#57) (@github-actions[bot])
- CompatHelper: bump compat for MLJFlux to 0.3 (keep existing compat) (#58) (@github-actions[bot])