The README file gives an example of loading a pretrained SAE:
```python
from sae import Sae
sae = Sae.load_from_hub("EleutherAI/sae-llama-3-8b-32x", hookpoint="layers.10")
```
However, if I swap `sae-llama-3-8b-32x` for one of the other SAEs on Hugging Face, loading fails.
Code that generates the error:
```python
from sae import Sae
sae = Sae.load_from_hub("EleutherAI/sae-pythia-70m-deduped-32k", hookpoint="layers.10")
```
Error message:
```
---------------------------------------------------------------------------
FileNotFoundError                         Traceback (most recent call last)
Cell In[8], line 6
      2 from sae import Sae
      4 # sae = Sae.load_from_hub("EleutherAI/sae-llama-3-8b-32x", hookpoint="layers.10")
      5 # sae = Sae.load_from_hub("EleutherAI/sae-pythia-70m-32k", hookpoint="layers.10")
----> 6 sae = Sae.load_from_hub("EleutherAI/sae-pythia-70m-deduped-32k", hookpoint="layers.10")

File my_python_environment/lib/python3.12/site-packages/sae/sae.py:122, in Sae.load_from_hub(name, hookpoint, device, decoder)
    119 elif not repo_path.joinpath("cfg.json").exists():
    120     raise FileNotFoundError("No config file found; try specifying a layer.")
--> 122 return Sae.load_from_disk(repo_path, device=device, decoder=decoder)

File my_python_environment/lib/python3.12/site-packages/sae/sae.py:133, in Sae.load_from_disk(path, device, decoder)
    124 @staticmethod
    125 def load_from_disk(
    126     path: Path | str,
   (...)
    129     decoder: bool = True,
    130 ) -> "Sae":
    131     path = Path(path)
--> 133 with open(path / "cfg.json", "r") as f:
    134     cfg_dict = json.load(f)
    135 d_in = cfg_dict.pop("d_in")

FileNotFoundError: [Errno 2] No such file or directory: 'home_directory/.cache/huggingface/hub/models--EleutherAI--sae-pythia-70m-deduped-32k/snapshots/7a64bade597212176dfe0782f9d839b94f0addaf/layers.10/cfg.json'
```
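The traceback shows that `load_from_hub` looks for `cfg.json` inside a per-hookpoint subdirectory (`layers.10/cfg.json`) of the downloaded repo snapshot, so the error likely means the chosen repo does not contain a folder for that hookpoint. One way to check the real repo's layout is `huggingface_hub.list_repo_files(repo_id)`. The helper below is a hypothetical sketch that filters such a file list for valid hookpoint folders; the `sample_files` layout is illustrative, not the actual contents of any EleutherAI repo.

```python
def available_hookpoints(repo_files):
    """Return directory names that contain a cfg.json (candidate hookpoints)."""
    hookpoints = set()
    for path in repo_files:
        parts = path.split("/")
        # A loadable hookpoint folder is assumed to hold a cfg.json directly.
        if len(parts) == 2 and parts[1] == "cfg.json":
            hookpoints.add(parts[0])
    return sorted(hookpoints)

# Illustrative repo layout: per-hookpoint folders, each with config + weights.
sample_files = [
    "layers.10/cfg.json",
    "layers.10/sae.safetensors",
    "layers.20/cfg.json",
    "layers.20/sae.safetensors",
    "README.md",
]
print(available_hookpoints(sample_files))  # ['layers.10', 'layers.20']
```

Running this against the actual file list of `EleutherAI/sae-pythia-70m-deduped-32k` would reveal whether the repo uses different hookpoint names than `layers.10`, or stores a single `cfg.json` at the repo root.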