
Explicit model specification #68

Merged 20 commits on May 20, 2021

Commits
a6b9de9
move actions from core, stop from being a provider module
wilsonmr Apr 28, 2021
5b6b28d
add new type of model, which is a sequence of other models, allowing …
wilsonmr Apr 28, 2021
7b6b3c5
remove duplicate import
wilsonmr Apr 29, 2021
4122eb7
only allow explicit specification of models
wilsonmr Apr 29, 2021
4dbe1a4
move sequential to layers
wilsonmr Apr 30, 2021
8be7987
update module names, cleanup some docs warnings
wilsonmr Apr 30, 2021
2e3d670
remove spurious comment in train
wilsonmr Apr 30, 2021
76cb8ea
flatten out the inner layers in model_to_load
wilsonmr Apr 30, 2021
4d287db
might as well just call the model params input model
wilsonmr Apr 30, 2021
3a1232e
add basic layers tests
wilsonmr Apr 30, 2021
4f68d58
model tests, probably could be improved but at least have coverage.
wilsonmr Apr 30, 2021
20a41de
renamed coupling_pair to coupling_block
jmarshrossney May 14, 2021
af77d41
layer actions for batch norm and global rescaling
jmarshrossney May 14, 2021
3b52d3c
update example runcards
jmarshrossney May 14, 2021
5bfc87a
remove default scale factor for global rescaling layer
jmarshrossney May 14, 2021
6e44773
add epsilon and remove shift from data standardisation
jmarshrossney May 14, 2021
ff02307
update tests
jmarshrossney May 14, 2021
f147afa
add test for independence of rescaling layers, including breaking exa…
jmarshrossney May 14, 2021
8abf747
update layer independence test for generic layers
jmarshrossney May 14, 2021
3e50d20
raise AssertionError instead of custom exception
jmarshrossney May 14, 2021
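
Taken together, these commits make model construction explicit: a runcard now lists the layers of the model directly, and anvil builds them in sequence. A minimal sketch of the resulting specification, assuming only the keys and the API.model_to_load entry point that appear in the test diff below:

from anvil.api import API

# Layer parameters as used in anvil/tests/test_models.py.
params = {
    "hidden_shape": (32,),
    "n_blocks": 1,
    "lattice_length": 6,
    "lattice_dimension": 2,
}

# The model is given explicitly as a sequence of layer specifications;
# each entry constructs an independent layer with its own parameters.
model_spec = {
    "model": [
        {"layer": "nice", **params},
        {"layer": "nice", **params},
    ]
}

model = API.model_to_load(**model_spec)  # iterable over the constructed layers

Sharing params at the top level of the specification, rather than per layer, is exactly the "breaking example" that the new tests below guard against.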
anvil/tests/test_models.py (66 changes: 53 additions & 13 deletions)
@@ -11,6 +11,11 @@
from anvil.api import API
from anvil.models import LAYER_OPTIONS

+
+class LayersNotIndependentError(Exception):
+    pass
+
+
LAYERS = list(LAYER_OPTIONS.keys())

PARAMS = {
@@ -59,25 +64,31 @@ def test_model_construction(layer_idx, n_blocks, lattice_length_half, hidden_sha

def layer_independence_test(model_spec):
    """Check that each layer's parameters are updated independently."""
-
-    # Collect over these layers
-    model = API.model_to_load(**model_spec)
-    layer1, layer2 = [layer for layer in model]
-
-    layer2_copy = deepcopy(layer2)
+    model = iter(API.model_to_load(**model_spec))
+    model_copy = deepcopy(model)

    # Update parameters in first layer
-    valid_key, valid_tensor = next(iter(layer1.state_dict().items()))
-    update = {valid_key: torch.rand_like(valid_tensor)}
+    layer1 = next(model)
+    update = {}
+    for valid_key, valid_tensor in layer1.state_dict().items():
+        update[valid_key] = torch.rand_like(valid_tensor)
    layer1.load_state_dict(update, strict=False)

-    # Check that second layer is unchanged
+    # Check that this is different from the copy
+    layer1_copy = next(model_copy)
+    for original, copy in zip(layer1.parameters(), layer1_copy.parameters()):
+        assert not torch.allclose(original, copy)
+
+    # Now check that the other layers are unchanged
    # NOTE: may be safer to iterate over shared keys
-    for original, copy in zip(layer2.parameters(), layer2_copy.parameters()):
-        assert torch.allclose(original, copy)
+    for layer, layer_copy in zip(model, model_copy):
+        for original, copy in zip(layer.parameters(), layer_copy.parameters()):
+            if not torch.allclose(original, copy):
+                raise LayersNotIndependentError(
+                    "Parameters are being shared amongst layers that should be independent."
+                )
wilsonmr (Owner, Author) commented on May 14, 2021:

    was there a reason you couldn't expect an assertion error?

jmarshrossney (Collaborator) replied on May 14, 2021:

    no


# TODO: extend to other layers... @pytest.mark.parametrize("layer_action", LAYERS)
@torch.no_grad()
def test_layer_independence_global_rescaling():
    # Build a model with two identical sets of layers
@@ -96,4 +107,33 @@ def test_layer_independence_global_rescaling():
        ],
        "scale": 1.0,
    }
-    layer_independence_test(breaking_example)
+    with pytest.raises(LayersNotIndependentError):
+        layer_independence_test(breaking_example)
+
+
+# TODO: could extend to all layers quite easily
+@torch.no_grad()
+def test_layer_independence_additive():
+    params = {
+        "hidden_shape": (32,),
+        "n_blocks": 1,
+        "lattice_length": 6,
+        "lattice_dimension": 2,
+    }
+    working_example = {
+        "model": [
+            {"layer": "nice", **params},
+            {"layer": "nice", **params},
+        ]
+    }
+    layer_independence_test(working_example)
+
+    breaking_example = {
+        "model": [
+            {"layer": "nice"},
+            {"layer": "nice"},
+        ],
+        **params,
+    }
+    with pytest.raises(LayersNotIndependentError):
+        layer_independence_test(breaking_example)
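
The final commit (3e50d20) follows up on the review exchange above by raising AssertionError instead of the custom exception. A hypothetical sketch of that simplification, not part of this diff:

# Hypothetical sketch of commit 3e50d20's change: use plain asserts in the
# helper and expect AssertionError at the call site, so the custom
# LayersNotIndependentError class can be deleted.
def layer_independence_test(model_spec):
    ...
    # Now check that the other layers are unchanged
    for layer, layer_copy in zip(model, model_copy):
        for original, copy in zip(layer.parameters(), layer_copy.parameters()):
            # A bare assert replaces the raise of LayersNotIndependentError
            assert torch.allclose(original, copy), "parameters shared between layers"


@torch.no_grad()
def test_layer_independence_additive():
    ...
    with pytest.raises(AssertionError):
        layer_independence_test(breaking_example)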