Update plots #76
Conversation
```python
        raise ConfigError("Seed is outside of appropriate range: [0, 2 ** 32]")
    return manual_bootstrap_seed


def parse_cosh_fit_min_separation(self, n: int, training_geometry):
```
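For context, a possible shape of the parse rule the first two lines of this hunk belong to. This is a minimal sketch only: the diff shows just the raise and return lines, so the function name, signature, and the assumption that ConfigError comes from reportengine.configparser are guesses, not the actual code.

```python
# Hypothetical reconstruction; only the raise and return lines appear
# in the diff above.
from reportengine.configparser import ConfigError


def parse_manual_bootstrap_seed(self, manual_bootstrap_seed: int):
    """Validate a user-supplied bootstrap seed from the runcard."""
    if not 0 <= manual_bootstrap_seed <= 2 ** 32:
        raise ConfigError("Seed is outside of appropriate range: [0, 2 ** 32]")
    return manual_bootstrap_seed
```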
can this not just take the lattice_length directly?
yep, good point
I can replace training_geometry.length with lattice_length in this parse function, but not in the production rule directly below. I'm sure by now I should understand this behaviour, but alas, I forget.

Also, since we're moving towards removing training_context (which contains the entire training runcard) in favour of things more akin to training_geometry, which just extract the specific parameters we need (#62), perhaps it does make sense to leave this as is.
that's really weird, I don't understand that...
I guess leave it for now.
apologies, it doesn't work with either the parse or the produce rule: the parse just worked because I was using the default value for cosh_fit_min_separation in cosh_fit_window...
I think it's because that key is only present in the training context, so in theory we could use lattice_length, but we'd have to collect over the fit or write the key in the sampling runcard; I don't know why I didn't think about this earlier.
So definitely leave it for now, and I will look into extracting parameters from the trained model in a more satisfactory way.
aka the issue you linked.
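To make the parse/produce distinction in this thread concrete, here is a minimal sketch in the reportengine style the project uses. The bodies and the bounds check are illustrative assumptions, not anvil's actual code; only cosh_fit_min_separation, cosh_fit_window, training_geometry.length and lattice_length come from the discussion, and the ConfigError import location is assumed.

```python
from reportengine.configparser import ConfigError


class ConfigSketch:
    # A parse_ rule validates a key typed in the runcard and may declare other
    # resources as arguments. Swapping training_geometry for lattice_length
    # fails here because, as discussed above, lattice_length is a key of the
    # training runcard and is not in scope in the sampling runcard.
    def parse_cosh_fit_min_separation(self, n: int, training_geometry):
        if not 0 <= n < training_geometry.length // 2:
            raise ConfigError(
                "cosh_fit_min_separation must lie within half the lattice length"
            )
        return n

    # A produce_ rule builds a derived resource from other resources; the same
    # scoping constraint applies, hence "leave it for now".
    def produce_cosh_fit_window(self, training_geometry, cosh_fit_min_separation: int = 0):
        return slice(cosh_fit_min_separation, training_geometry.length // 2)
```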
```diff
@@ -111,7 +119,7 @@ def magnetic_susceptibility(magnetization, abs_magnetization_squared):


 def magnetization_series(configs):
-    return configs.sum(axis=1)
+    return configs.sum(axis=1).numpy()
```
would it make more sense to just cast the configs to numpy earlier, instead of multiple times on the observables?
in other words, do we ever need configs to be a tensor?
yeah it would. I think it only happens the once if you discount the unused correlator calculation, but it would probably be better to cast to numpy at the end of sampling.
I'd rather just leave this niggle until we're messing around inside sample.py anyway, trying to generate configurations from each layer, if that's alright.
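As a sketch of the "cast once at the end of sampling" idea from this thread: the sampler below is a stand-in (the real one lives in sample.py and is flow-based, and sample_configs is a hypothetical name), but it shows observables receiving plain numpy arrays so the per-observable .numpy() call in the diff above becomes unnecessary.

```python
import numpy as np
import torch


def sample_configs(n_configs: int, lattice_size: int) -> np.ndarray:
    """Stand-in for the real sampler: work in torch, cast to numpy once at the end."""
    configs = torch.randn(n_configs, lattice_size)  # placeholder for the flow sampler
    return configs.numpy()


def magnetization_series(configs: np.ndarray) -> np.ndarray:
    # configs is already a numpy array, so no .numpy() is needed here
    return configs.sum(axis=1)


series = magnetization_series(sample_configs(100, 36))
print(series.shape)  # (100,)
```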
```diff
@@ -5,15 +5,18 @@
 {@table_two_point_scalars@}

 ## Two Point Correlator
 {@plot_two_point_correlator@}
+{@plot_two_point_correlator_error@}
```
do we want to keep adding to this report, or at some point provide some more examples of individual actions or smaller reports?
Yeah, I have opened an issue (#78) which is almost the same as addressing this.
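For what one of those smaller reports might look like, a sketch using only actions visible in this diff; the {@...@} action syntax follows the template above.

```markdown
## Two Point Correlator
{@plot_two_point_correlator@}
{@plot_two_point_correlator_error@}
```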