
[pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
pre-commit-ci[bot] committed Nov 15, 2024
1 parent aaf5710 commit 99c5ad7
Showing 1 changed file with 1 addition and 1 deletion: generation/maisi/README.md
@@ -50,7 +50,7 @@ We retrained several state-of-the-art diffusion model-based methods using our data
| 512x512x768 | [80,80,112], 8 patches | 4 | 55G | 904s | 48s |


The experiment was tested on an A100 80G GPU.

During inference, peak GPU memory usage occurs while the autoencoder decodes the latent features.
To reduce GPU memory usage, we can either increase `autoencoder_tp_num_splits` or reduce `autoencoder_sliding_window_infer_size`.
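To make the trade-off concrete, here is a minimal sketch of tuning these two settings before inference. The parameter names come from the README; the config-dict structure and the `tune_for_low_memory` helper are hypothetical illustrations, not the actual MAISI API.

```python
# Hedged sketch: lowering peak GPU memory for autoencoder decoding.
# `autoencoder_tp_num_splits` and `autoencoder_sliding_window_infer_size`
# are the README's parameter names; the config layout is an assumption.

def tune_for_low_memory(config):
    """Return a copy of the inference config tuned for lower peak memory:
    more tensor-parallel splits, and a smaller sliding-window infer size."""
    tuned = dict(config)
    # Doubling the number of splits processes the latent in smaller chunks.
    tuned["autoencoder_tp_num_splits"] = config["autoencoder_tp_num_splits"] * 2
    # Halving each sliding-window dimension (floored at 32) shrinks the
    # per-patch activation footprint at the cost of more patches.
    tuned["autoencoder_sliding_window_infer_size"] = [
        max(s // 2, 32) for s in config["autoencoder_sliding_window_infer_size"]
    ]
    return tuned


base_config = {
    "autoencoder_tp_num_splits": 4,
    "autoencoder_sliding_window_infer_size": [80, 80, 112],
}
print(tune_for_low_memory(base_config))
```

Both knobs trade speed for memory: more splits and smaller windows mean more (but cheaper) decoding passes over the latent volume.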
