docs: update and cleanup tutorials and examples #815

Merged · 14 commits · Mar 14, 2024
2 changes: 1 addition & 1 deletion .buildkite/pipeline.yml
@@ -55,7 +55,7 @@ steps:
queue: "juliagpu"
cuda: "*"
if: build.message !~ /\[skip docs\]/ && !build.pull_request.draft
timeout_in_minutes: 1000
timeout_in_minutes: 1440

env:
SECRET_DOCUMENTER_KEY: "yBJHrf5zUPu3VFotb0W4TRdauMTGgNUei3ax0xrVTmcrrP3EX8zSGaj9MNZji9H6JqyNEhCqZMGcHhxR5XK0f97YjhAp5rDNpwV6FbIXY8FXgFyIOLAenYUklta1W6fNM7KTz3Dq3UnKBAprKhQBwUw3SldTuvtm+IhiVT2rFmADqxMSQcv+5LivYEgAFrrd6LX+PHhonj38VN46z5Bi3TOIGAnByVlZssX7cOwaRg/8TloLPsWAUlQiPr89Vdlow9k6SQV8W9mf00/Rq4LFd1Eq1EYTCSmReVawMxVpXh1mj7MSabf9ppVWYOmjP5Rzatk8YFWlJf80HVvzY7tXRQ==;U2FsdGVkX1/UmKrEQkyqoZY6GckLGmpV5SsPa2byZhZbomiJKIrLntTIHK9GclgfCJ1MDCKhuo3hMVxq+NKVm0QmnMZk0Hlzszx8Up5pbcuSb2BA0Wm7IJzZ5/2uXfdBOXIejwk+GUabJtSpiDlYJUW/8bOWYZUwl9rmrOM44tPesEcs5yMZPShIowZnJqrvQUcWXZ/+aZjftX+Pe7nP0KddzYRYzhIJVzYmU394V9MWqZDqaNU19TJqnL8dNQxFLfyDibXuCqSD9QjV2I/iCfwr1xI81h11vmaatpZCDUUGYyxLRh1w5BuT1hHvLqktoAGliBD2lSOEB5KbwJu0nQCbEXPsM6CpMb4JygFloViQJ1L0kb91JEZIrnP6ruM1rrgrRu8Ufb+hob+BsCiEhSxwLy6ynOll2ZggMU/BVjQjkVQ3TmxBLvJ4T3QwzCSPpyN6lyHgSWNodN2Fn+PaWd0Sc0gSs8BY9PmOOc9m5ErVCrsscTr7aqcIbLZBnNA3y+mtLN5Vpuk3bmv9Bb9SAllXMLFFd604wtlxJi2Ecrb+6b4rc+QUmr5gnYqOYLCCOruFJfvMY63eFuuHWHKT+qroiGeuUhprUJDZUzhKUKnFFhhUZKtI11RAhVikZMPyMAsW4+gshQkEjXTRZqBpadpMh+c0uO8V2tRZAPIn7GXSdsCaGWbzL6yVZx79mM0W4sDKAp0Xs2wc04FgPbDu2sHGA+VxokrGctRovGVhSELx65aAj7x/ByhYdIByPCxHa7TNBUHb7n/4XLw8KIzVKr6jX2ep5m3KlYdjI7uxq8Hlpeu0hCRG3tdCqwOZsEOm3yhC3B/jsrKLzOsDP/x3ByAp8RvSVPP9WsWP55CxZUvc+q5LiVXBc7PhUC4nRB5/FMykjm6PboB92Y0DkP8Wql09FDr3vs8B3TkVLzOvzcz888oZTQpTaoixaAlVBs51al4N7UXhp5BInUCUIkknIyAEzXgK/5SpzixVoEkeNPkrMqg4hDaSHlKu0VDuqcP0Uv/9IKf/qs0+XK+2v9QBgqV16upbHK2EptgII3QJpXf2sq/IQTPXq3aly3NnpPUcIZJZtfG/Wk7qOFg/NhBVMvXWkrTRwQhJ5VXFTP0kXVpbgBml4Eq/zw+tAn5mmtieVaeFonZgqCIa+gi+tWiMy2V3aTubEYUGWTb3WOtxMt9i3bu9KhvOIr+fwCgpYUG9Vb/6v7H84zt5HT59sNlo9J8Acih8EfVZseC5JVF6ugxfnHc8BBOtvuUUFtOjIWwOgcfCiPXvsZdMQh0Ow3u9pYyJUZ3s+enHkBwtsu3+kXBkeL77eggslREVUz2g9eK8G5sKwSCsltgv8HQbHYARkXqq14Unb3NNakvab7BrQ2znWy7zU4P04Thtqn2fOqiAOUxuGRF5iNnkSnIZavtataT6rezB1efn4V2HYANcR4JoGj1JBXWa/dOJAYVnRLB7pbpS94yjbEMjRMB5aWSnjf6FzaDnXwfUAjAHvHNZsLxVhtIejyPQhg2kbSftwWWqw9zVvc3dz2a18V+MyNakcRiRS0CEqSl/L8vIhTppGultqzJZCKJKMAIzUE5+Q1EeDYL1kEImtSZ3hVXq4YaSB4q/bNKbDLG
4pWs7IO6DFR88j5KIPfMPy15jgP6v+w0QZh8JNPQLveXvmYU4T9vxalT8i1P/4Nc2tI1A8HUv0hVoNu6V0ugTxukMLJLLFrZ80Ic7xCxNzqlzzcMhsYOHhcR4fZCdDtmoNfZm066hAkRcQJ0yNbiv7GUxrybGzer+N+s7QtS7/AGxuVW1MNQlsqdwTL2KTOvkZWHYB5cHpfqeS6zSPczeEiOIgT1fRXv3kYRBaJ4Dk7aWwXuCBbN2vAhRCEjcJC6QXE4BUukycNacytP1+HhCBeouxdPli9jrPIz2YH0iy7kc47XPiJr7zR1KAza3M3boau6xeb/why5zV7gHzB08qAxhm+pBLm4ERdKSwe/KAdGX5M0hiqMLHceUwJgzWEgukJfntxeZnP1rFJnTEAbkBy/CKtEmdr4IJYIFZU59IE9WOeYgrzl677JoGblkJ2B1sj1U8DbsRHIN+muVdAmYu+PBft0Zxih4JNe/rgmC/hNpDClMEPIEk4euRLf3xl1OHCOcWfEKuhwV/wIwJ0dtQzjN97g6a9IbSlupLAwPDZM925hC7qYicuzrF0Nj64GeOSMb4WIiEGpgH8TWOYkgxea+RoNLu0MEErcz26jqnV1QS8YGEFtT8k0lnhivg+SXIcgdVMoyavFVjqP4FeNm0aL0vE5+odOzEIzKKVNvHqofO4HbrRmlbAorS9OfzRlHbzJznIWO+8yyQc6oKyeT92THh4+wFYXQzkw0bvHsJZ08OymCAnIP+dZCzOSJ/MzcI3BjmcMZcHasPS6CfgSRtm7o8GJvyZljfW/z4zdy6+HzGLxZcgiwDc4qODYBEMdSRdDrxzpENZ4/IK3JTTcafsrRgbqi1nYyadQQx5A5xFhyYZq04iaDxx8GmDV6v4l4Hp/iPHoG0mp3yjxrt5hUjEyLW/+5lZXCnBxIfEiV/vdZBXGJxZs3oiATTOjMQxQYbbcyEs02WnFqRsMxDwlTsUnhbnEPUb9vkQhJjkHAsOt2V7csH7ojwlJPCp9baWV6iqSbvtQu5sSwWItOJflxiw2NCVvvMhGjOQOb8sFf6yeAKHOi+2vk0T7Kkr5JziPZ1L2PDmiZHwMhjVwd2uIZP6pBuQtoyxxn6040nUX5QwHjVO7RamVmqOLoKJFQHYWlRs1FHSxK6BQWH/1MeSDvCBWfiFnm6wWUMWr9NKlSOMFPWEnVdQ+eu83yoSRVT0U7WNoSw/tyK1KB64DL6Z7VffWB1AvnMZ1uvuSFtkEjHOTrcmDkGzSDQs61qO8kmDiTqlpMDVbv7DmhiXpBAC2agwLw/xh1m3xoRNetTNxowMVRjokZCrevoPLCcxrRDl0tZz9g/fF2q9rMRIAEhwWTYC+WgX4KQ4Xj4BpFbx1d3G2oHEAIJItyIFHHoKzwKJl+InNJEdXZUCEpdE3IiI2agnbP9W/1sSRWKE1Ub0KukhK7GYBIwdnwx0GgqqLYTRNrA8PnKLkSQp4ri9XJRSxI52jqbMP/S3x2ogIbvyYeQXbCwS7jloEMSgDSAQcTPZpPEzR5tHZG/XMMYHZX/h+aARdsaQJq7UNoIAJ8zrwkWnjNKSqrpF3Wfn/uOCYHXsyHepa/4f9cf0mtklGa4nSZPV8nVa0jvXzt2lzmg7uur0/ysa8Wl2LAZpkcTLlZ1tbFrbdBbcibPGi4r0QYJ6BM60yjfarwq/WnFHY2BLIpJKOxxA/7ko6kZ05t+fe/IqZnkxHqH9PsdpCN63H+py0S3tMOijrULzRMNjalF1ywPNm2Ugl/EiBbx7wv2G30wMdk1pdKbjgiGdq2wF1nPb4fOGBSoFtk8USmDSno1/FnYRKCH4gcyV3x/vHflFUHlSbh5Aw43YT1wxASJt7lvPq/uSTVw8wVe0bavIW4Gzyk7Fds5fjEi0eyZRtCfAbPlsCQ6aybuZQD870vdT8bxc1sRTdjDRbtFy8PGQqRwR91MhqITLOT7FpptcwltMV9jsGAEXBS6EX754sT3hY
LB9OK6INME0rAbHUmq9tvCknIAiH3LIwuJJHHBLEFVeYveTk/00iBHjvn4Yb9MYEPaiTgMcQwRz8khf1aWj/Vz16c2aOjdiZOBEDpxH5h5tJLAMmc8bHfVAhWqdC6hDaulAgHEuo5gcKbkIbWEX/jvuOJq0fO9EYz6eDwwPy6hGGB5pzasJIKUDvXRCh40f5Iy8DOcLJRh+KCxjD9zDqhQQZ8WC+9l1/brckKPw59n0F3Sh3c+nHfyYBBlmsONTmjUZTgRwqg92z+2YQWyUf7g5jmNtBEjLyXtNvn9HkZGXd9YVp7Ps10GklGQiKYZmWUl73KGtsfsRF+/SQ4kRumd4YnlC7b04w6tFRk3HMog+38OVZDwMj40unK4XJWYABJX0t2XySGlXrL8ZNW8xR/iCVsW6+4glxFvTeH5ujPUjQKFb/0bvbZeExeYnRECdDz6u3z/gQYZiUMA8NUNKJuQTzWV9nXyubOWEG9NuJZ4X7oaGE/DtWO3j57r8bcE9KdtV+DpGvKyS+lBrdxL5vlOJ3rX+PqWeIOkxKc85lKT/us8H054bVubQebl0Tc+rvyqZVM7MToHgDj9VwlEbfV6o02Em/J5JUh0GMCoJw6CX5rfHgAIPlhlh/wXRVj8FQcUiTSzDb8lpwXxGO9rNWNfgE9ZRduiXT5LnUYhf6BC5eomyvZ6DcqDABJyWIV7kejbT0TlspicebTzP/kMyTPrGM9TchZjMdv6bPp8jRf4lUGWBjD4i1Diris5aM="
1 change: 0 additions & 1 deletion .github/workflows/CI.yml
@@ -25,7 +25,6 @@ jobs:
- AdaptiveLoss
- Logging
- Forward
- NeuralAdapter
- DGM
version:
- "1"
10 changes: 10 additions & 0 deletions docs/Project.toml
@@ -1,14 +1,18 @@
[deps]
AdvancedHMC = "0bf59076-c3b1-5ca4-86bd-e02cd72cde3d"
ComponentArrays = "b0b7db55-cfe3-40fc-9ded-d10e2dbeff66"
Cubature = "667455a9-e2ce-5579-9412-b964f529a492"
DiffEqBase = "2b5f629d-d688-5b77-993f-72d75c75574e"
Distributions = "31c24e10-a181-5473-b8eb-7969acd0382f"
Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
DomainSets = "5b8099bc-c8ec-5219-889f-1d9e522a28bf"
Flux = "587475ba-b771-5e3f-ad9e-33799f191a9c"
Integrals = "de52edbc-65ea-441a-8357-d3a637375a31"
LineSearches = "d3d80556-e9d4-5f37-9878-2ab0fcc64255"
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
Lux = "b2108857-7c20-44ae-9111-449ecde12c47"
LuxCUDA = "d0bbae9a-e099-4d5b-a835-1c6931763bda"
MethodOfLines = "94925ecb-adb7-4558-8ed8-f975c56a0bf4"
ModelingToolkit = "961ee093-0014-501f-94e3-6117800e7a78"
MonteCarloMeasurements = "0987c9cc-fe09-11e8-30f0-b96dd679fdca"
NeuralPDE = "315f7962-48a3-4962-8226-d0f33b1235f0"
@@ -19,19 +23,24 @@ OptimizationPolyalgorithms = "500b13db-7e66-49ce-bda4-eed966be6282"
OrdinaryDiffEq = "1dea7af3-3e70-54e6-95c3-0bf5283fa5ed"
Plots = "91a5bcdd-55d7-5caf-9e0b-520d859cae80"
QuasiMonteCarlo = "8a4e6c94-4038-4cdc-81c3-7e6ffdb2a71b"
Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
Roots = "f2b01f46-fcfa-551c-844a-d8ac1e96c665"
SpecialFunctions = "276daf66-3868-5448-9aa4-cd146d93841b"

[compat]
AdvancedHMC = "0.6"
ComponentArrays = "0.15"
Cubature = "1.5"
DiffEqBase = "6.106"
Distributions = "0.25"
Documenter = "1"
DomainSets = "0.6, 0.7"
Flux = "0.14"
Integrals = "4"
LineSearches = "7.2"
Lux = "0.5"
LuxCUDA = "0.3"
MethodOfLines = "0.10"
ModelingToolkit = "8.33"
MonteCarloMeasurements = "1"
NeuralPDE = "5.3"
@@ -42,5 +51,6 @@ OptimizationPolyalgorithms = "0.2"
OrdinaryDiffEq = "6.31"
Plots = "1.36"
QuasiMonteCarlo = "0.3"
Random = "1"
Roots = "2.0"
SpecialFunctions = "2.1"
4 changes: 1 addition & 3 deletions docs/src/examples/3rd.md
@@ -47,7 +47,7 @@ callback = function (p, l)
return false
end

res = Optimization.solve(prob, Adam(0.01); callback = callback, maxiters = 2000)
res = Optimization.solve(prob, OptimizationOptimisers.Adam(0.01); maxiters = 2000)
phi = discretization.phi
```

@@ -67,5 +67,3 @@ x_plot = collect(xs)
plot(x_plot, u_real, title = "real")
plot!(x_plot, u_predict, title = "predict")
```

![hodeplot](https://user-images.githubusercontent.com/12683885/90276340-69bc3e00-de6c-11ea-89a7-7d291123a38b.png)
2 changes: 1 addition & 1 deletion docs/src/examples/heterogeneous.md
@@ -45,5 +45,5 @@ callback = function (p, l)
return false
end
res = Optimization.solve(prob, BFGS(); callback = callback, maxiters = 100)
res = Optimization.solve(prob, BFGS(); maxiters = 100)
```
4 changes: 1 addition & 3 deletions docs/src/examples/ks.md
@@ -72,7 +72,7 @@ callback = function (p, l)
end
opt = OptimizationOptimJL.BFGS()
res = Optimization.solve(prob, opt; callback = callback, maxiters = 2000)
res = Optimization.solve(prob, opt; maxiters = 2000)
phi = discretization.phi
```

@@ -93,5 +93,3 @@ p2 = plot(xs, u_real, title = "analytic")
p3 = plot(xs, diff_u, title = "error")
plot(p1, p2, p3)
```

![plotks](https://user-images.githubusercontent.com/12683885/91025889-a6253200-e602-11ea-8f61-8e6e2488e025.png)
27 changes: 16 additions & 11 deletions docs/src/examples/linear_parabolic.md
@@ -23,10 +23,10 @@ w(t, 1) = \frac{e^{\lambda_1} cos(\frac{x}{a})-e^{\lambda_2}cos(\frac{x}{a})}{\l

with a physics-informed neural network.

```@example
using NeuralPDE, Lux, ModelingToolkit, Optimization, OptimizationOptimJL
```@example linear_parabolic
using NeuralPDE, Lux, ModelingToolkit, Optimization, OptimizationOptimisers, OptimizationOptimJL, LineSearches
using Plots
import ModelingToolkit: Interval, infimum, supremum
using ModelingToolkit: Interval, infimum, supremum

@parameters t, x
@variables u(..), w(..)
@@ -71,7 +71,7 @@ input_ = length(domains)
n = 15
chain = [Lux.Chain(Dense(input_, n, Lux.σ), Dense(n, n, Lux.σ), Dense(n, 1)) for _ in 1:2]

strategy = QuadratureTraining()
strategy = StochasticTraining(500)
discretization = PhysicsInformedNN(chain, strategy)

@named pdesystem = PDESystem(eqs, bcs, domains, [t, x], [u(t, x), w(t, x)])
@@ -85,14 +85,14 @@ global iteration = 0
callback = function (p, l)
if iteration % 10 == 0
println("loss: ", l)
println("pde_losses: ", map(l_ -> l_(p), pde_inner_loss_functions))
println("bcs_losses: ", map(l_ -> l_(p), bcs_inner_loss_functions))
println("pde_losses: ", map(l_ -> l_(p.u), pde_inner_loss_functions))
println("bcs_losses: ", map(l_ -> l_(p.u), bcs_inner_loss_functions))
end
global iteration += 1
return false
end

res = Optimization.solve(prob, BFGS(); callback = callback, maxiters = 5000)
res = Optimization.solve(prob, OptimizationOptimisers.Adam(1e-2); maxiters = 10000)

phi = discretization.phi

@@ -105,14 +105,19 @@ analytic_sol_func(t, x) = [u_analytic(t, x), w_analytic(t, x)]
u_real = [[analytic_sol_func(t, x)[i] for t in ts for x in xs] for i in 1:2]
u_predict = [[phi[i]([t, x], minimizers_[i])[1] for t in ts for x in xs] for i in 1:2]
diff_u = [abs.(u_real[i] .- u_predict[i]) for i in 1:2]
ps = []
for i in 1:2
p1 = plot(ts, xs, u_real[i], linetype = :contourf, title = "u$i, analytic")
p2 = plot(ts, xs, u_predict[i], linetype = :contourf, title = "predict")
p3 = plot(ts, xs, diff_u[i], linetype = :contourf, title = "error")
plot(p1, p2, p3)
savefig("sol_u$i")
push!(ps, plot(p1, p2, p3))
end
```

![linear_parabolic_sol_u1](https://user-images.githubusercontent.com/26853713/125745625-49c73760-0522-4ed4-9bdd-bcc567c9ace3.png)
![linear_parabolic_sol_u2](https://user-images.githubusercontent.com/26853713/125745637-b12e1d06-e27b-46fe-89f3-076d415fcd7e.png)
```@example linear_parabolic
ps[1]
```

```@example linear_parabolic
ps[2]
```
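A recurring change across these examples is `l_(p)` becoming `l_(p.u)` inside the callbacks. If the motivation is unclear from the diff alone: newer Optimization.jl versions hand the callback a state object rather than the bare parameter vector, and the parameters live in that object's `u` field. A minimal sketch of the pattern — the state type and loss functions below are hypothetical stand-ins for illustration, not NeuralPDE or Optimization.jl API:

```julia
# Hypothetical stand-in for the state object the optimizer passes to the
# callback; the real one comes from Optimization.jl and carries more fields.
struct DemoState
    u::Vector{Float64}
end

# Stand-ins for pde_inner_loss_functions / bcs_inner_loss_functions:
inner_losses = [θ -> sum(abs2, θ), θ -> sum(θ)]

callback = function (p, l)
    # Evaluate each per-term loss at the parameter vector `p.u`, not at `p`:
    println("loss: ", l, "  terms: ", map(l_ -> l_(p.u), inner_losses))
    return false  # returning true would stop the optimization early
end

callback(DemoState([1.0, 2.0]), 5.0)
```

Calling any of the old `l_(p)` forms against such a state errors, which is why every callback in this PR was updated the same way.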
31 changes: 16 additions & 15 deletions docs/src/examples/nonlinear_elliptic.md
@@ -26,10 +26,10 @@ where k is a root of the algebraic (transcendental) equation f(k) = g(k).

This is done using a derivative neural network approximation.

```@example
```@example nonlinear_elliptic
using NeuralPDE, Lux, ModelingToolkit, Optimization, OptimizationOptimJL, Roots
using Plots
import ModelingToolkit: Interval, infimum, supremum
using ModelingToolkit: Interval, infimum, supremum

@parameters x, y
Dx = Differential(x)
@@ -79,18 +79,14 @@ input_ = length(domains)
n = 15
chain = [Lux.Chain(Dense(input_, n, Lux.σ), Dense(n, n, Lux.σ), Dense(n, 1)) for _ in 1:6] # 1:number of @variables

strategy = QuadratureTraining()
strategy = GridTraining(0.01)
Member review comment: We shouldn't default to grid training.
discretization = PhysicsInformedNN(chain, strategy)

vars = [u(x, y), w(x, y), Dxu(x, y), Dyu(x, y), Dxw(x, y), Dyw(x, y)]
@named pdesystem = PDESystem(eqs_, bcs__, domains, [x, y], vars)
prob = NeuralPDE.discretize(pdesystem, discretization)
sym_prob = NeuralPDE.symbolic_discretize(pdesystem, discretization)

strategy = NeuralPDE.QuadratureTraining()
discretization = PhysicsInformedNN(chain, strategy)
sym_prob = NeuralPDE.symbolic_discretize(pdesystem, discretization)

pde_inner_loss_functions = sym_prob.loss_functions.pde_loss_functions
bcs_inner_loss_functions = sym_prob.loss_functions.bc_loss_functions[1:6]
aprox_derivative_loss_functions = sym_prob.loss_functions.bc_loss_functions[7:end]
@@ -99,15 +95,15 @@ global iteration = 0
callback = function (p, l)
if iteration % 10 == 0
println("loss: ", l)
println("pde_losses: ", map(l_ -> l_(p), pde_inner_loss_functions))
println("bcs_losses: ", map(l_ -> l_(p), bcs_inner_loss_functions))
println("der_losses: ", map(l_ -> l_(p), aprox_derivative_loss_functions))
println("pde_losses: ", map(l_ -> l_(p.u), pde_inner_loss_functions))
println("bcs_losses: ", map(l_ -> l_(p.u), bcs_inner_loss_functions))
println("der_losses: ", map(l_ -> l_(p.u), aprox_derivative_loss_functions))
end
global iteration += 1
return false
end

res = Optimization.solve(prob, BFGS(); callback = callback, maxiters = 5000)
res = Optimization.solve(prob, BFGS(); maxiters = 100)

phi = discretization.phi

@@ -120,14 +116,19 @@ analytic_sol_func(x, y) = [u_analytic(x, y), w_analytic(x, y)]
u_real = [[analytic_sol_func(x, y)[i] for x in xs for y in ys] for i in 1:2]
u_predict = [[phi[i]([x, y], minimizers_[i])[1] for x in xs for y in ys] for i in 1:2]
diff_u = [abs.(u_real[i] .- u_predict[i]) for i in 1:2]
ps = []
for i in 1:2
p1 = plot(xs, ys, u_real[i], linetype = :contourf, title = "u$i, analytic")
p2 = plot(xs, ys, u_predict[i], linetype = :contourf, title = "predict")
p3 = plot(xs, ys, diff_u[i], linetype = :contourf, title = "error")
plot(p1, p2, p3)
savefig("non_linear_elliptic_sol_u$i")
push!(ps, plot(p1, p2, p3))
end
```

![non_linear_elliptic_sol_u1](https://user-images.githubusercontent.com/26853713/125745550-0b667c10-b09a-4659-a543-4f7a7e025d6c.png)
![non_linear_elliptic_sol_u2](https://user-images.githubusercontent.com/26853713/125745571-45a04739-7838-40ce-b979-43b88d149028.png)
```@example nonlinear_elliptic
ps[1]
```

```@example nonlinear_elliptic
ps[2]
```
26 changes: 16 additions & 10 deletions docs/src/examples/nonlinear_hyperbolic.md
@@ -32,11 +32,11 @@ where k is a root of the algebraic (transcendental) equation f(k) = g(k), j0 and

We solve this with NeuralPDE:

```@example
using NeuralPDE, Lux, ModelingToolkit, Optimization, OptimizationOptimJL, Roots
```@example nonlinear_hyperbolic
using NeuralPDE, Lux, ModelingToolkit, Optimization, OptimizationOptimJL, Roots, LineSearches
using SpecialFunctions
using Plots
import ModelingToolkit: Interval, infimum, supremum
using ModelingToolkit: Interval, infimum, supremum

@parameters t, x
@variables u(..), w(..)
@@ -94,12 +94,12 @@ bcs_inner_loss_functions = sym_prob.loss_functions.bc_loss_functions

callback = function (p, l)
println("loss: ", l)
println("pde_losses: ", map(l_ -> l_(p), pde_inner_loss_functions))
println("bcs_losses: ", map(l_ -> l_(p), bcs_inner_loss_functions))
println("pde_losses: ", map(l_ -> l_(p.u), pde_inner_loss_functions))
println("bcs_losses: ", map(l_ -> l_(p.u), bcs_inner_loss_functions))
return false
end

res = Optimization.solve(prob, BFGS(); callback = callback, maxiters = 1000)
res = Optimization.solve(prob, BFGS(linesearch = BackTracking()); maxiters = 200)

phi = discretization.phi

@@ -112,14 +112,20 @@ analytic_sol_func(t, x) = [u_analytic(t, x), w_analytic(t, x)]
u_real = [[analytic_sol_func(t, x)[i] for t in ts for x in xs] for i in 1:2]
u_predict = [[phi[i]([t, x], minimizers_[i])[1] for t in ts for x in xs] for i in 1:2]
diff_u = [abs.(u_real[i] .- u_predict[i]) for i in 1:2]
ps = []
for i in 1:2
p1 = plot(ts, xs, u_real[i], linetype = :contourf, title = "u$i, analytic")
p2 = plot(ts, xs, u_predict[i], linetype = :contourf, title = "predict")
p3 = plot(ts, xs, diff_u[i], linetype = :contourf, title = "error")
plot(p1, p2, p3)
savefig("nonlinear_hyperbolic_sol_u$i")
push!(ps, plot(p1, p2, p3))
end
```

![nonlinear_hyperbolic_sol_u1](https://user-images.githubusercontent.com/26853713/126457614-d19e7a4d-f9e3-4e78-b8ae-1e58114a744e.png)
![nonlinear_hyperbolic_sol_u2](https://user-images.githubusercontent.com/26853713/126457617-ee26c587-a97f-4a2e-b6b7-b326b1f117af.png)

```@example nonlinear_hyperbolic
ps[1]
```

```@example nonlinear_hyperbolic
ps[2]
```
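Several solves in this PR move from a plain `BFGS()` with a high `maxiters` to `BFGS(linesearch = BackTracking())` with far fewer iterations. As a sketch of how those pieces fit together — assuming, as the updated `using` lines suggest, that `BFGS` is Optim's optimizer re-exported through OptimizationOptimJL and `BackTracking` comes from LineSearches.jl:

```julia
using OptimizationOptimJL, LineSearches

# A backtracking line search typically makes each BFGS step cheaper and more
# robust than the default line search, which is one reason `maxiters` can
# drop so sharply in these examples.
opt = BFGS(linesearch = BackTracking())

# res = Optimization.solve(prob, opt; maxiters = 200)  # `prob` as built in the example
```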
22 changes: 6 additions & 16 deletions docs/src/examples/wave.md
@@ -17,7 +17,7 @@ Further, the solution of this equation with the given boundary conditions is pre

```@example wave
using NeuralPDE, Lux, Optimization, OptimizationOptimJL
import ModelingToolkit: Interval
using ModelingToolkit: Interval

@parameters t, x
@variables u(..)
@@ -81,8 +81,6 @@ p3 = plot(ts, xs, diff_u, linetype = :contourf, title = "error");
plot(p1, p2, p3)
```

![waveplot](https://user-images.githubusercontent.com/12683885/101984293-74a7a380-3c91-11eb-8e78-72a50d88e3f8.png)

## 1D Damped Wave Equation with Dirichlet boundary conditions

Now let's solve the 1-dimensional wave equation with damping.
@@ -101,7 +99,7 @@ with grid discretization `dx = 0.05` and physics-informed neural networks. Here,
```@example wave2
using NeuralPDE, Lux, ModelingToolkit, Optimization, OptimizationOptimJL
using Plots, Printf
import ModelingToolkit: Interval, infimum, supremum
using ModelingToolkit: Interval, infimum, supremum

@parameters t, x
@variables u(..) Dxu(..) Dtu(..) O1(..) O2(..)
@@ -159,14 +157,14 @@ bcs_inner_loss_functions = sym_prob.loss_functions.bc_loss_functions

callback = function (p, l)
println("loss: ", l)
println("pde_losses: ", map(l_ -> l_(p), pde_inner_loss_functions))
println("bcs_losses: ", map(l_ -> l_(p), bcs_inner_loss_functions))
println("pde_losses: ", map(l_ -> l_(p.u), pde_inner_loss_functions))
println("bcs_losses: ", map(l_ -> l_(p.u), bcs_inner_loss_functions))
return false
end

res = Optimization.solve(prob, BFGS(); callback = callback, maxiters = 2000)
res = Optimization.solve(prob, BFGS(); maxiters = 2000)
prob = remake(prob, u0 = res.u)
res = Optimization.solve(prob, BFGS(); callback = callback, maxiters = 2000)
res = Optimization.solve(prob, BFGS(); maxiters = 2000)

phi = discretization.phi[1]

@@ -214,11 +212,3 @@ p2 = plot(ts, xs, u_predict, linetype = :contourf, title = "predict");
p3 = plot(ts, xs, diff_u, linetype = :contourf, title = "error");
plot(p1, p2, p3)
```

We can see the results here:

![Damped_wave_sol_adaptive_u](https://user-images.githubusercontent.com/12683885/149665332-d4daf7d0-682e-4933-a2b4-34f403881afb.png)

Plotted as a line, one can see the analytical solution and the prediction here:

![1Dwave_damped_adaptive](https://user-images.githubusercontent.com/12683885/149665327-69d04c01-2240-45ea-981e-a7b9412a3b58.gif)
16 changes: 6 additions & 10 deletions docs/src/tutorials/constraints.md
@@ -21,9 +21,9 @@ p(-2.2) = p(2.2) = 0
with Physics-Informed Neural Networks.

```@example fokkerplank
using NeuralPDE, Lux, ModelingToolkit, Optimization, OptimizationOptimJL
using NeuralPDE, Lux, ModelingToolkit, Optimization, OptimizationOptimJL, LineSearches
using Integrals, Cubature
import ModelingToolkit: Interval, infimum, supremum
using ModelingToolkit: Interval, infimum, supremum
# the example is taken from this article https://arxiv.org/abs/1910.10503
@parameters x
@variables p(..)
@@ -78,15 +78,13 @@ aprox_derivative_loss_functions = sym_prob.loss_functions.bc_loss_functions

cb_ = function (p, l)
println("loss: ", l)
println("pde_losses: ", map(l_ -> l_(p), pde_inner_loss_functions))
println("bcs_losses: ", map(l_ -> l_(p), bcs_inner_loss_functions))
println("additional_loss: ", norm_loss_function(phi, p, nothing))
println("pde_losses: ", map(l_ -> l_(p.u), pde_inner_loss_functions))
println("bcs_losses: ", map(l_ -> l_(p.u), bcs_inner_loss_functions))
println("additional_loss: ", norm_loss_function(phi, p.u, nothing))
return false
end

res = Optimization.solve(prob, LBFGS(), callback = cb_, maxiters = 400)
prob = remake(prob, u0 = res.u)
res = Optimization.solve(prob, BFGS(), callback = cb_, maxiters = 2000)
res = Optimization.solve(prob, BFGS(linesearch = BackTracking()), callback = cb_, maxiters = 600)
```

And some analysis:
@@ -103,5 +101,3 @@ u_predict = [first(phi(x, res.u)) for x in xs]
plot(xs, u_real, label = "analytic")
plot!(xs, u_predict, label = "predict")
```

![fp](https://user-images.githubusercontent.com/12683885/129405830-3d00c24e-adf1-443b-aa36-6af0e5305821.png)