Merge pull request #83 from JuliaTrustworthyAI/37-update-docs
Some updates to the docs
pat-alt authored Apr 10, 2024
2 parents 3e8115c + 8510999 commit 32b648e
Showing 42 changed files with 15,343 additions and 15,535 deletions.
2 changes: 1 addition & 1 deletion Project.toml
@@ -1,7 +1,7 @@
 name = "LaplaceRedux"
 uuid = "c52c1a26-f7c5-402b-80be-ba1e638ad478"
 authors = ["Patrick Altmeyer"]
-version = "0.1.6"
+version = "0.1.7"
 
 [deps]
 ChainRulesCore = "d360d2e6-b24c-11e9-a2a3-2a2ae2dbcce4"
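
The only change here is the patch-version bump from 0.1.6 to 0.1.7. For downstream projects that pin LaplaceRedux in their own `[compat]` section, a minimal sketch of opting into the new release (standard Pkg calls only; nothing LaplaceRedux-specific):

```julia
using Pkg

# Pkg.compat (Julia >= 1.8) rewrites the [compat] entry of the active project:
Pkg.compat("LaplaceRedux", "0.1.7")  # allows 0.1.7 up to (but excluding) 0.2.0
Pkg.update("LaplaceRedux")           # resolve to the newly registered version
```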
376 changes: 188 additions & 188 deletions README_files/figure-commonmark/cell-4-output-1.svg
1,747 changes: 876 additions & 871 deletions README_files/figure-commonmark/cell-7-output-1.svg
5 changes: 3 additions & 2 deletions _freeze/docs/src/index/execute-results/md.json
@@ -1,7 +1,8 @@
 {
-  "hash": "78c5dc332bd7b51e7b48569a1e1728f5",
+  "hash": "0b3babc9ba412d09f74672b1ac5c443d",
   "result": {
-    "markdown": "---\ntitle: LaplaceRedux\n---\n\n\n```@meta\nCurrentModule = LaplaceRedux\n```\n\n\nDocumentation for [LaplaceRedux.jl](https://github.com/JuliaTrustworthyAI/LaplaceRedux.jl).\n\n\n# LaplaceRedux\n\n\n\n\n`LaplaceRedux.jl` is a library written in pure Julia that can be used for effortless Bayesian Deep Learning through Laplace Approximation (LA). In the development of this package I have drawn inspiration from this Python [library](https://aleximmer.github.io/Laplace/index.html#setup) and its companion [paper](https://arxiv.org/abs/2106.14806) [@daxberger2021laplace].\n\n## 🚩 Installation\n\nThe stable version of this package can be installed as follows:\n\n```{.julia}\nusing Pkg\nPkg.add(\"LaplaceRedux.jl\")\n```\n\nThe development version can be installed like so:\n\n```{.julia}\nusing Pkg\nPkg.add(\"https://github.com/JuliaTrustworthyAI/LaplaceRedux.jl\")\n```\n\n## 🖥️ Basic Usage\n\n`LaplaceRedux.jl` can be used for any neural network trained in [`Flux.jl`](https://fluxml.ai/Flux.jl/dev/). Below we show basic usage examples involving two simple models for a regression and a classification task, respectively.\n\n### Regression\n\n\n\nA complete worked example for a regression model can be found in the [docs](https://www.paltmeyer.com/LaplaceRedux.jl/dev/tutorials/regression/). Here we jump straight to Laplace Approximation and take the pre-trained model `nn` as given. Then LA can be implemented as follows, where we specify the model `likelihood`. The plot show the fitted values overlaid with a 95% confidence interval. As expected, predictive uncertainty quickly increases in areas that are not populated by any training data.\n\n::: {.cell execution_count=3}\n``` {.julia .cell-code}\nla = Laplace(nn; likelihood=:regression)\nfit!(la, data)\noptimize_prior!(la)\nplot(la, X, y; zoom=-5, size=(500,500))\n```\n\n::: {.cell-output .cell-output-display execution_count=4}\n![](index_files/figure-commonmark/cell-4-output-1.svg){}\n:::\n:::\n\n\n\n\n### Binary Classification\n\n\n\nOnce again we jump straight to LA and refer to the [docs](https://www.paltmeyer.com/LaplaceRedux.jl/dev/tutorials/mlp/) for a complete worked example involving binary classification. In this case we need to specify `likelihood=:classification`. The plot below shows the resulting posterior predictive distributions as contours in the two-dimensional feature space: note how the **Plugin** Approximation on the left compares to the Laplace Approximation on the right.\n\n::: {.cell execution_count=6}\n``` {.julia .cell-code}\ntheme(:lime)\n\nla = Laplace(nn; likelihood=:classification)\nfit!(la, data)\nla_untuned = deepcopy(la) # saving for plotting\noptimize_prior!(la; n_steps=500)\n\n# Plot the posterior predictive distribution:\nzoom=0\np_plugin = plot(la, X, ys; title=\"Plugin\", link_approx=:plugin, clim=(0,1))\np_untuned = plot(la_untuned, X, ys; title=\"LA - raw (λ=$(unique(diag(la_untuned.P₀))[1]))\", clim=(0,1), zoom=zoom)\np_laplace = plot(la, X, ys; title=\"LA - tuned (λ=$(round(unique(diag(la.P₀))[1],digits=2)))\", clim=(0,1), zoom=zoom)\nplot(p_plugin, p_untuned, p_laplace, layout=(1,3), size=(1700,400))\n```\n\n::: {.cell-output .cell-output-display execution_count=7}\n![](index_files/figure-commonmark/cell-7-output-1.svg){}\n:::\n:::\n\n\n## 📢 JuliaCon 2022\n\nThis project was presented at JuliaCon 2022 in July 2022. See [here](https://pretalx.com/juliacon-2022/talk/Z7MXFS/) for details.\n\n## 🛠️ Contribute\n\nContributions are very much welcome! Please follow the [SciML ColPrac guide](https://github.com/SciML/ColPrac). You may want to start by having a look at any open [issues](https://github.com/JuliaTrustworthyAI/LaplaceRedux.jl/issues). \n\n## 🎓 References\n\n",
+    "engine": "jupyter",
+    "markdown": "```@meta\nCurrentModule = LaplaceRedux\n```\n\n![](assets/wide_logo.png)\n\nDocumentation for [LaplaceRedux.jl](https://github.com/JuliaTrustworthyAI/LaplaceRedux.jl).\n\n\n# LaplaceRedux\n\n\n\n`LaplaceRedux.jl` is a library written in pure Julia that can be used for effortless Bayesian Deep Learning through Laplace Approximation (LA). In the development of this package I have drawn inspiration from this Python [library](https://aleximmer.github.io/Laplace/index.html#setup) and its companion [paper](https://arxiv.org/abs/2106.14806) [@daxberger2021laplace].\n\n## 🚩 Installation\n\nThe stable version of this package can be installed as follows:\n\n```{.julia}\nusing Pkg\nPkg.add(\"LaplaceRedux.jl\")\n```\n\nThe development version can be installed like so:\n\n```{.julia}\nusing Pkg\nPkg.add(\"https://github.com/JuliaTrustworthyAI/LaplaceRedux.jl\")\n```\n\n## 🏃 Getting Started\n\nIf you are new to Deep Learning in Julia or simply prefer learning through videos, check out this awesome YouTube [tutorial](https://www.youtube.com/channel/UCQwQVlIkbalDzmMnr-0tRhw) by [doggo.jl](https://www.youtube.com/@doggodotjl/about) 🐶. Additionally, you can also find a [video](https://www.youtube.com/watch?v=oWko8FRj_64) of my presentation at JuliaCon 2022 on YouTube. \n\n## 🖥️ Basic Usage\n\n`LaplaceRedux.jl` can be used for any neural network trained in [`Flux.jl`](https://fluxml.ai/Flux.jl/dev/). Below we show basic usage examples involving two simple models for a regression and a classification task, respectively.\n\n### Regression\n\n\n\nA complete worked example for a regression model can be found in the [docs](https://www.paltmeyer.com/LaplaceRedux.jl/dev/tutorials/regression/). Here we jump straight to Laplace Approximation and take the pre-trained model `nn` as given. Then LA can be implemented as follows, where we specify the model `likelihood`. The plot shows the fitted values overlaid with a 95% confidence interval. As expected, predictive uncertainty quickly increases in areas that are not populated by any training data.\n\n::: {.cell execution_count=3}\n``` {.julia .cell-code}\nla = Laplace(nn; likelihood=:regression)\nfit!(la, data)\noptimize_prior!(la)\nplot(la, X, y; zoom=-5, size=(500,500))\n```\n\n::: {.cell-output .cell-output-display execution_count=28}\n![](index_files/figure-commonmark/cell-4-output-1.svg){}\n:::\n:::\n\n\n\n\n### Binary Classification\n\n\n\nOnce again we jump straight to LA and refer to the [docs](https://www.paltmeyer.com/LaplaceRedux.jl/dev/tutorials/mlp/) for a complete worked example involving binary classification. In this case we need to specify `likelihood=:classification`. The plot below shows the resulting posterior predictive distributions as contours in the two-dimensional feature space: note how the **Plugin** Approximation on the left compares to the Laplace Approximation on the right.\n\n::: {.cell execution_count=6}\n``` {.julia .cell-code}\nla = Laplace(nn; likelihood=:classification)\nfit!(la, data)\nla_untuned = deepcopy(la) # saving for plotting\noptimize_prior!(la; n_steps=100)\n\n# Plot the posterior predictive distribution:\nzoom=0\np_plugin = plot(la, X, ys; title=\"Plugin\", link_approx=:plugin, clim=(0,1))\np_untuned = plot(la_untuned, X, ys; title=\"LA - raw (λ=$(unique(diag(la_untuned.P₀))[1]))\", clim=(0,1), zoom=zoom)\np_laplace = plot(la, X, ys; title=\"LA - tuned (λ=$(round(unique(diag(la.P₀))[1],digits=2)))\", clim=(0,1), zoom=zoom)\nplot(p_plugin, p_untuned, p_laplace, layout=(1,3), size=(1700,400))\n```\n\n::: {.cell-output .cell-output-display execution_count=31}\n![](index_files/figure-commonmark/cell-7-output-1.svg){}\n:::\n:::\n\n\n## 📢 JuliaCon 2022\n\nThis project was presented at JuliaCon 2022 in July 2022. See [here](https://pretalx.com/juliacon-2022/talk/Z7MXFS/) for details.\n\n## 🛠️ Contribute\n\nContributions are very much welcome! Please follow the [SciML ColPrac guide](https://github.com/SciML/ColPrac). You may want to start by having a look at any open [issues](https://github.com/JuliaTrustworthyAI/LaplaceRedux.jl/issues). \n\n## 🎓 References\n\n",
     "supporting": [
       "index_files"
     ],
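
The README snippets embedded in this diff take the pre-trained network `nn`, the `data` iterable and the inputs `X`, `y` as given. For readers of the commit, here is a self-contained sketch of the assumed regression setup: the toy data and the small Flux MLP are invented for illustration, while `Laplace`, `fit!`, `optimize_prior!` and the `plot` recipe are taken verbatim from the documented snippet.

```julia
using Flux, LaplaceRedux, Plots

# Toy 1-D regression data (illustrative only); Flux expects features × observations:
n = 100
X = reshape(collect(range(-3, 3; length=n)), 1, n)
y = sin.(vec(X)) .+ 0.3 .* randn(n)
data = zip(eachcol(X), y)  # iterable of (input, target) pairs

# A small MLP standing in for the pre-trained `nn`:
nn = Chain(Dense(1, 16, tanh), Dense(16, 1))
opt_state = Flux.setup(Adam(0.05), nn)
for _ in 1:500
    Flux.train!((m, x, t) -> Flux.mse(m(x), t), nn, data, opt_state)
end

# Laplace Approximation, exactly as in the README:
la = Laplace(nn; likelihood=:regression)
fit!(la, data)
optimize_prior!(la)
plot(la, X, y; zoom=-5, size=(500, 500))
```

The classification snippet follows the same pattern with `likelihood=:classification` and a network whose output layer matches the number of classes.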

2 comments on commit 32b648e

@pat-alt (Member, Author)


@JuliaRegistrator register

Release notes:

Updates to docs

@JuliaRegistrator


Registration pull request created: JuliaRegistries/General/104624

Tagging

After the above pull request is merged, it is recommended that a tag is created on this repository for the registered package version.

This will be done automatically if the Julia TagBot GitHub Action is installed, or can be done manually through the GitHub interface, or via:

git tag -a v0.1.7 -m "<description of version>" 32b648e5c45db291a45b8b932ee607caf846227a
git push origin v0.1.7
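
Once the tag is pushed and the registry PR is merged, users can pin the release directly; a short sketch using stock Pkg keywords (nothing here is specific to this repository beyond the package name and version from the diff):

```julia
using Pkg
Pkg.add(name="LaplaceRedux", version="0.1.7")  # install the newly registered release
Pkg.status("LaplaceRedux")                     # should report LaplaceRedux v0.1.7
```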
