
final docs things for now
pat-alt committed Nov 21, 2022
1 parent 7e80a97 commit d1b2709
Showing 9 changed files with 12,177 additions and 11,411 deletions.
2 changes: 1 addition & 1 deletion _freeze/docs/src/index/execute-results/md.json
@@ -1,7 +1,7 @@
{
"hash": "b9ccb333a18be1f46dee29b57ce312f5",
"result": {
"markdown": "```@meta\nCurrentModule = LaplaceRedux\n```\n\n# LaplaceRedux\n\nDocumentation for [LaplaceRedux.jl](https://github.com/pat-alt/LaplaceRedux.jl).\n\n\n\n`LaplaceRedux.jl` is a library written in pure Julia that can be used for effortless Bayesian Deep Learning trough Laplace Approximation (LA). In the development of this package I have drawn inspiration from this Python [library](https://aleximmer.github.io/Laplace/index.html#setup) and its companion [paper](https://arxiv.org/abs/2106.14806) [@daxberger2021laplace].\n\n## 🚩 Installation\n\nThe stable version of this package can be installed as follows:\n\n```{.julia}\nusing Pkg\nPkg.add(\"LaplaceRedux.jl\")\n```\n\nThe development version can be installed like so:\n\n```{.julia}\nusing Pkg\nPkg.add(\"https://github.com/pat-alt/LaplaceRedux.jl\")\n```\n\n## 🖥️ Basic Usage\n\n`LaplaceRedux.jl` can be used for any neural network trained in [`Flux.jl`](https://fluxml.ai/Flux.jl/dev/). Below we show basic usage examples involving two simple models for a regression and a classification task, respectively.\n\n### Regression\n\n\n\nA complete worked example for a regression model can be found in the[docs](https://www.paltmeyer.com/LaplaceRedux.jl/dev/tutorials/regression/). Here we jump straight to Laplace Approximation and take the pre-trained model `nn` as given. Then LA can be implemented as follows, where we specify the model `likelihood` and supply pre-determined values for the prior precision `λ` and the observational noise `σ`. The plot show the fitted values overlayed with a 95% confidence interval. As expected, predictive uncertainty quickly increases in areas that are not populated by any training data.\n\n::: {.cell execution_count=3}\n``` {.julia .cell-code}\nla = Laplace(nn; likelihood=:regression, λ=λ, σ=σtrue)\nfit!(la, data)\nplot(la, X, y)\n```\n\n::: {.cell-output .cell-output-display execution_count=18}\n![](index_files/figure-commonmark/cell-4-output-1.svg){}\n:::\n:::\n\n\n### Binary Classification\n\n\n\nOnce again we jump straight to LA and refer to the [docs](https://www.paltmeyer.com/LaplaceRedux.jl/dev/tutorials/mlp/) for a complete worked example involving binary classification. In this case we need to specify `likelihood=:classification`. The plot below shows the resulting posterior predictive distributions as contours in the two-dimensional feature space: note how the **Plugin** Approximation on the left compares to the Laplace Approximation on the right.\n\n::: {.cell execution_count=5}\n``` {.julia .cell-code}\nla = Laplace(nn; likelihood=:classification, λ=λ)\nfit!(la, data)\n\n# Plot the posterior predictive distribution:\np_plugin = plot(la, X, ys; title=\"Plugin\", link_approx=:plugin, clim=(0,1))\np_laplace = plot(la, X, ys; title=\"Laplace\", clim=(0,1))\nplot(p_plugin, p_laplace, layout=(1,2), size=(1000,400))\n```\n\n::: {.cell-output .cell-output-display execution_count=20}\n![](index_files/figure-commonmark/cell-6-output-1.svg){}\n:::\n:::\n\n\n## 📢 JuliaCon 2022\n\nThis project was presented at JuliaCon 2022 in July 2022. See [here](https://pretalx.com/juliacon-2022/talk/Z7MXFS/) for details.\n\n## 🛠️ Contribute\n\nContributions are very much welcome! Please follow the [SciML ColPrac guide](https://github.com/SciML/ColPrac). You may want to start by having a look at any open issues. \n\n\n## ❎ Known Limitations\n\nThis library currently offers native support only for models composed and trained in Flux. It also still lacks out-of-the-box support for hyperparameter tuning. 
\n\n## 🎓 References\n\n",
"markdown": "```@meta\nCurrentModule = LaplaceRedux\n```\n\n# LaplaceRedux\n\nDocumentation for [LaplaceRedux.jl](https://github.com/pat-alt/LaplaceRedux.jl).\n\n\n\n`LaplaceRedux.jl` is a library written in pure Julia that can be used for effortless Bayesian Deep Learning trough Laplace Approximation (LA). In the development of this package I have drawn inspiration from this Python [library](https://aleximmer.github.io/Laplace/index.html#setup) and its companion [paper](https://arxiv.org/abs/2106.14806) [@daxberger2021laplace].\n\n## 🚩 Installation\n\nThe stable version of this package can be installed as follows:\n\n```{.julia}\nusing Pkg\nPkg.add(\"LaplaceRedux.jl\")\n```\n\nThe development version can be installed like so:\n\n```{.julia}\nusing Pkg\nPkg.add(\"https://github.com/pat-alt/LaplaceRedux.jl\")\n```\n\n## 🖥️ Basic Usage\n\n`LaplaceRedux.jl` can be used for any neural network trained in [`Flux.jl`](https://fluxml.ai/Flux.jl/dev/). Below we show basic usage examples involving two simple models for a regression and a classification task, respectively.\n\n### Regression\n\n\n\nA complete worked example for a regression model can be found in the [docs](https://www.paltmeyer.com/LaplaceRedux.jl/dev/tutorials/regression/). Here we jump straight to Laplace Approximation and take the pre-trained model `nn` as given. Then LA can be implemented as follows, where we specify the model `likelihood` and supply pre-determined values for the prior precision `λ` and the observational noise `σ`. The plot show the fitted values overlayed with a 95% confidence interval. As expected, predictive uncertainty quickly increases in areas that are not populated by any training data.\n\n::: {.cell execution_count=3}\n``` {.julia .cell-code}\nla = Laplace(nn; likelihood=:regression, λ=λ, σ=σtrue)\nfit!(la, data)\nplot(la, X, y)\n```\n\n::: {.cell-output .cell-output-display execution_count=4}\n![](index_files/figure-commonmark/cell-4-output-1.svg){}\n:::\n:::\n\n\n### Binary Classification\n\n\n\nOnce again we jump straight to LA and refer to the [docs](https://www.paltmeyer.com/LaplaceRedux.jl/dev/tutorials/mlp/) for a complete worked example involving binary classification. In this case we need to specify `likelihood=:classification`. The plot below shows the resulting posterior predictive distributions as contours in the two-dimensional feature space: note how the **Plugin** Approximation on the left compares to the Laplace Approximation on the right.\n\n::: {.cell execution_count=5}\n``` {.julia .cell-code}\nla = Laplace(nn; likelihood=:classification, λ=λ)\nfit!(la, data)\n\n# Plot the posterior predictive distribution:\np_plugin = plot(la, X, ys; title=\"Plugin\", link_approx=:plugin, clim=(0,1))\np_laplace = plot(la, X, ys; title=\"Laplace\", clim=(0,1))\nplot(p_plugin, p_laplace, layout=(1,2), size=(1000,400))\n```\n\n::: {.cell-output .cell-output-display execution_count=6}\n![](index_files/figure-commonmark/cell-6-output-1.svg){}\n:::\n:::\n\n\n## 📢 JuliaCon 2022\n\nThis project was presented at JuliaCon 2022 in July 2022. See [here](https://pretalx.com/juliacon-2022/talk/Z7MXFS/) for details.\n\n## 🛠️ Contribute\n\nContributions are very much welcome! Please follow the [SciML ColPrac guide](https://github.com/SciML/ColPrac). You may want to start by having a look at any open issues. \n\n\n## ❎ Known Limitations\n\nThis library currently offers native support only for models composed and trained in Flux. It also still lacks out-of-the-box support for hyperparameter tuning. 
\n\n## 🎓 References\n\n",
"supporting": [
"index_files"
],
Expand Down
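The regression snippet in the committed docs takes the pre-trained Flux model `nn`, the `data` iterable, and the values `λ` and `σtrue` as given. For readers following along without the linked tutorial, here is a minimal sketch of what that setup might look like; the toy data, network architecture, and training loop are illustrative assumptions, not part of the commit:

```julia
using Flux, LaplaceRedux, Plots
using Random; Random.seed!(2022)

# Hypothetical 1D toy data; σtrue and λ are assumed values, not from the commit:
n = 100
σtrue = 0.3
λ = 0.1
X = collect(range(-2, 2; length=n))'           # 1 × n feature matrix
y = vec(sin.(2 .* X)) .+ σtrue .* randn(n)     # noisy sine wave targets
data = zip(eachcol(X), y)                      # iterable of (x, y) pairs

# Small MLP trained with a standard Flux loop (architecture is an assumption):
nn = Chain(Dense(1, 16, tanh), Dense(16, 1))
loss(x, y) = Flux.mse(nn(x), y)
opt = Flux.Adam()
for _ in 1:200
    Flux.train!(loss, Flux.params(nn), data, opt)
end

# Laplace Approximation exactly as in the committed snippet:
la = Laplace(nn; likelihood=:regression, λ=λ, σ=σtrue)
fit!(la, data)
plot(la, X, y)
```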
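Likewise, the binary-classification snippet assumes a trained classifier `nn`, features `X`, and labels `ys`. A hypothetical counterpart, again a sketch under assumed data and hyperparameters rather than the package's documented example:

```julia
using Flux, LaplaceRedux, Plots
using Random; Random.seed!(2022)

# Hypothetical 2D toy data (two Gaussian blobs); λ is an assumed value:
n = 50
λ = 0.1
X = hcat(randn(2, n) .- 2, randn(2, n) .+ 2)   # 2 × 2n feature matrix
ys = vcat(zeros(n), ones(n))                   # binary labels
data = zip(eachcol(X), ys)

# MLP with a single logit output, trained on the logit binary cross-entropy:
nn = Chain(Dense(2, 16, relu), Dense(16, 1))
loss(x, y) = Flux.logitbinarycrossentropy(nn(x), y)
opt = Flux.Adam()
for _ in 1:100
    Flux.train!(loss, Flux.params(nn), data, opt)
end

# Laplace Approximation with the classification likelihood, as committed:
la = Laplace(nn; likelihood=:classification, λ=λ)
fit!(la, data)
p_plugin = plot(la, X, ys; title="Plugin", link_approx=:plugin, clim=(0, 1))
p_laplace = plot(la, X, ys; title="Laplace", clim=(0, 1))
plot(p_plugin, p_laplace, layout=(1, 2), size=(1000, 400))
```

The single-logit output matches the `likelihood=:classification` interface used in the committed snippet; the side-by-side plot contrasts the plugin predictive with the Laplace predictive.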
378 changes: 189 additions & 189 deletions _freeze/docs/src/index/figure-commonmark/cell-4-output-1.svg
