Add layer specific training to init flux train node? #97

Open
yogotatara3 opened this issue Nov 20, 2024 · 7 comments

@yogotatara3

Hello dear Kijai,

First of all, thank you so much for creating these amazing nodes and workflows! They’ve made LoRA training much more accessible and streamlined for me.

I recently came across an article discussing the concept of training specific layers with LoRA to save time and resources. As described in the article, you can target specific layer regions for different use cases; for instance, layers 2 and 7 are said to be ideal for training faces. It’s incredible not only because you can selectively control which layers to train, but also because the resulting LoRA files are significantly smaller, enabling super-fast training sessions.

For reference, the FLUX model consists of 37 single and double blocks, with 18 of them being double blocks, offering fine-grained control during training. You can check out all the layers here:
All 37 layers.

Additionally, this Reddit article highlights how compact and efficient FLUX LoRA files can be, often smaller than 45 MB at 128 dimensions.

Thank you again for your time and effort—it’s truly appreciated!

yogotatara3 changed the title from "Add layer specific training?" to "Add layer specific training to init flux train node?" on Nov 20, 2024
@kijai
Owner

kijai commented Nov 20, 2024

This is already possible with this node connected to the "block_args" input:

[Screenshot: the node connected to block_args]

That would select the linear1 layer of block 20 (basically the attention only).

And, for example, this would select the same for blocks 7-20: lora_unet_single_blocks_(7-20)_linear1
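A minimal sketch of how such a range pattern could expand into explicit per-block module names; the expansion logic here is an assumption for illustration, not the node's actual code:

```python
import re

# Minimal sketch, assuming "(7-20)" is an inclusive range placeholder that
# expands into one module name per block (illustration only, not the node's
# actual implementation).
def expand_block_args(pattern: str) -> list[str]:
    m = re.search(r"\((\d+)-(\d+)\)", pattern)
    if not m:
        return [pattern]  # plain single-block name, nothing to expand
    lo, hi = int(m.group(1)), int(m.group(2))
    return [pattern.replace(m.group(0), str(i)) for i in range(lo, hi + 1)]

print(expand_block_args("lora_unet_single_blocks_(7-20)_linear1"))
# ['lora_unet_single_blocks_7_linear1', ..., 'lora_unet_single_blocks_20_linear1']
```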

@yogotatara3
Author

Hey, thanks for the fast answer!
Super cool, amazing that it's integrated.
Now I can move my whole process to Comfy!

Can you also select double and single blocks specifically?

@kijai
Owner

kijai commented Nov 20, 2024

> Hey, thanks for the fast answer!
> Super cool, amazing that it's integrated.
> Now I can move my whole process to Comfy!
>
> Can you also select double and single blocks specifically?

Yes, just replace single with double in the string.
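(Applying that to the earlier example gives lora_unet_double_blocks_(7-20)_linear1.)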

@yogotatara3
Author

Blessings! Thanks, mate.

@yogotatara3
Author

yogotatara3 commented Nov 21, 2024

Hello, hope you're doing well. I have another question, please:

Can I select multiple blocks as a comma-separated sequence in one run?
For example: lora_unet_single_blocks_(5,8,9,10)_linear1

And how can I change the number of repeats, like in Kohya, where you rename the dataset folder with a number prefix such as 10_foldername?

Thanks for taking the time.

yogotatara3 reopened this on Nov 21, 2024
@taek9333

taek9333 commented Dec 3, 2024

I would rather write it like this, but I'm not sure: lora_unet_single_blocks_7_linear1, lora_unet_single_blocks_20_linear1
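A tiny hypothetical helper (illustration only, not part of the node) that builds that comma-separated form from a list of block indices:

```python
# Hypothetical helper: build the comma-separated block_args string
# suggested above from a list of block indices.
def block_args(indices, kind="single", module="linear1"):
    return ", ".join(f"lora_unet_{kind}_blocks_{i}_{module}" for i in indices)

print(block_args([7, 20]))
# lora_unet_single_blocks_7_linear1, lora_unet_single_blocks_20_linear1
```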

@taek9333

taek9333 commented Dec 3, 2024

> And how can I change the number of repeats, like in Kohya, where you rename the dataset folder with a number prefix such as 10_foldername?

Just here:

[Screenshot: Capture d’écran 2024-12-03 162730]
