
[Feature Request] Allow complete group override while Inheriting other Configs #2993

Open
qsh-zh opened this issue Nov 22, 2024 · 1 comment
Labels: enhancement

qsh-zh commented Nov 22, 2024

🚀 Feature Request

When running multiple experiments with hierarchical configurations, there's a need to completely override specific config groups (like optimizer settings) while preserving other non-group configurations. Currently, Hydra merges all configurations, which can lead to unwanted parameter inheritance.
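
A minimal standalone sketch of the underlying behavior (plain OmegaConf, not Hydra's exact composition code; the values mirror the example below):

---
from omegaconf import OmegaConf

# Roughly what experiment/first.yaml leaves in the optim node...
base = OmegaConf.create(
    {"optim": {"lr": 0.002, "eps": 1e-4, "weight_decay": 0.01, "extra_param": "stale"}}
)
# ...and what experiment/second.yaml writes on top of it.
override = OmegaConf.create({"optim": {"lr": 0.05, "momentum": 0.9}})

# OmegaConf.merge is recursive: nested keys are merged, never replaced,
# so eps, weight_decay, and extra_param all survive the override.
print(OmegaConf.to_yaml(OmegaConf.merge(base, override)))
---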

Here is a concrete example:

demo/train.py
---
import hydra
from omegaconf import DictConfig, OmegaConf

@hydra.main(version_base=None, config_path="conf", config_name="config")
def main(cfg: DictConfig) -> None:
    print("\n=== Configuration Details ===")
    print("\nFull config:")
    print(OmegaConf.to_yaml(cfg))

if __name__ == "__main__":
    main()

---
demo/conf/config.yaml
---
# @package _global_

defaults:
  - _self_
  - optim: null
  - experiment: null

---
demo/conf/optim/adam.yaml
---
lr: 0.001
betas: [0.9, 0.999]
eps: 1e-8
weight_decay: 0.0


---
demo/conf/optim/sgd.yaml
---
lr: 0.01
momentum: 0.9


---
demo/conf/experiment/first.yaml
---
# @package _global_

defaults:
  - override /optim: adam

optim:
  lr: 0.002  # override default lr
  weight_decay: 0.01
  eps: 1e-4
  extra_param: "this_should_be_deleted_in_second_experiment"


other_important_param: "this_should_be_not_deleted_in_second_experiment"

---
demo/conf/experiment/second.yaml
---
# @package _global_

defaults:
  - /experiment/first
  - override /optim: sgd


optim:
  lr: 0.05  # Only keep SGD-specific params
  momentum: 0.9


---

If I run the first experiment:

# python train.py experiment=first
optim:
  lr: 0.002
  betas:
  - 0.9
  - 0.999
  eps: 0.0001
  weight_decay: 0.01
  extra_param: this_should_be_deleted_in_second_experiment
other_important_param: this_should_be_not_deleted_in_second_experiment

If I run the second experiment:
# python train.py experiment=second
optim:
  lr: 0.05
  momentum: 0.9
  weight_decay: 0.01
  eps: 0.0001
  extra_param: this_should_be_deleted_in_second_experiment
other_important_param: this_should_be_not_deleted_in_second_experiment

What I expected:

  1. In the second experiment, other_important_param inherited from the first experiment is kept.
  2. The optim group from the first experiment is completely overridden, meaning the keys extra_param, weight_decay, and eps are removed.
# python train.py experiment=second
optim:
  lr: 0.05
  momentum: 0.9
other_important_param: this_should_be_not_deleted_in_second_experiment
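
For reference, plain assignment at the OmegaConf level already has the replace semantics this request asks for; a hedged sketch (standalone code, not an existing Hydra API) of the intended result:

---
from omegaconf import OmegaConf, open_dict

cfg = OmegaConf.create(
    {
        "optim": {"lr": 0.002, "eps": 1e-4, "weight_decay": 0.01,
                  "extra_param": "this_should_be_deleted_in_second_experiment"},
        "other_important_param": "this_should_be_not_deleted_in_second_experiment",
    }
)

# Assigning a fresh node replaces optim wholesale instead of merging into it;
# open_dict is only needed when the config is in struct mode.
with open_dict(cfg):
    cfg.optim = OmegaConf.create({"lr": 0.05, "momentum": 0.9})

print(OmegaConf.to_yaml(cfg))  # optim keeps only lr and momentum
---

As a stopgap, Hydra's override grammar can delete individual values from the command line (e.g. python train.py experiment=second '~optim.extra_param'), but that requires naming every stale key by hand; the feature requested here would make the replacement declarative in the defaults list.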

Motivation

See the example above.


qsh-zh commented Nov 24, 2024

Related discussion: #2956 (comment)

@jesszzzz @omry @Jasha10
