Can't train. #164

Closed

DarkAlchy opened this issue Dec 23, 2024 · 0 comments

DarkAlchy commented Dec 23, 2024

It sees my 4 audio files, then it starts a wandb run, and then it says:

GPU available: True (cuda), used: True
TPU available: False, using: 0 TPU cores
IPU available: False, using: 0 IPUs
HPU available: False, using: 0 HPUs
You are using a CUDA device ('NVIDIA GeForce RTX 4090') that has Tensor Cores. To properly utilize them, you should set torch.set_float32_matmul_precision('medium' | 'high') which will trade-off precision for performance. For more details, read https://pytorch.org/docs/stable/generated/torch.set_float32_matmul_precision.html#torch.set_float32_matmul_precision
LOCAL_RANK: 0 - CUDA_VISIBLE_DEVICES: [0]

  | Name          | Type                             | Params
-----------------------------------------------------------------
0 | diffusion     | ConditionedDiffusionModelWrapper | 1.2 B
1 | diffusion_ema | EMA                              | 1.1 B
2 | losses        | MultiLoss                        | 0
-----------------------------------------------------------------
1.1 B     Trainable params
1.2 B     Non-trainable params
2.3 B     Total params
9,080.665 Total estimated model params size (MB)
venv\lib\site-packages\pytorch_lightning\utilities\data.py:104: Total length of DataLoader across ranks is zero. Please make sure this was your intention.
venv\lib\site-packages\pytorch_lightning\utilities\data.py:104: Total length of CombinedLoader across ranks is zero. Please make sure this was your intention.
Trainer.fit stopped: No training batches.
wandb: Waiting for W&B process to finish... (success).
wandb: View run proud-feather-1 at: https://wandb.ai/XXXX/harmonai_train/runs/waq1kjin
wandb: Synced 5 W&B file(s), 0 media file(s), 2 artifact file(s) and 0 other file(s)
wandb: Find logs at: .\wandb\run-20241223_034540-waq1kjin\logs

No training batches?
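For reference, the "Total length of DataLoader across ranks is zero" warning means the dataset object handed to the trainer ended up with zero usable batches, so Trainer.fit stops immediately. Below is a minimal sketch (generic PyTorch, not this repo's actual dataset code; AudioFolderDataset and the "path/to/audio" directory are hypothetical stand-ins) of how one might confirm whether the training dataloader is really empty before launching training:

```python
from pathlib import Path

from torch.utils.data import Dataset, DataLoader


class AudioFolderDataset(Dataset):
    """Hypothetical dataset that just lists audio files under a directory."""

    def __init__(self, audio_dir: str, exts=(".wav", ".flac", ".mp3")):
        self.files = [p for p in Path(audio_dir).rglob("*") if p.suffix.lower() in exts]

    def __len__(self) -> int:
        return len(self.files)

    def __getitem__(self, idx: int):
        # Real training code would load and crop the audio here; returning the
        # path is enough to check whether batches are produced at all.
        return str(self.files[idx])


ds = AudioFolderDataset("path/to/audio")  # same directory the trainer was pointed at
print("dataset length:", len(ds))         # 0 here would explain "No training batches."

dl = DataLoader(ds, batch_size=4, shuffle=True, drop_last=True)
print("dataloader length:", len(dl))      # batch_size > len(ds) with drop_last=True
                                          # also rounds this down to 0
```

With only 4 files, a batch size larger than 4 combined with drop_last=True would also leave the dataloader with zero batches, even though the files themselves were found. This is only a diagnostic sketch under those assumptions, not the training script's actual data pipeline.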
