[WIP] Include alignment steps in coffeine pipeline #58
base: main
Conversation
It seems tests are missing
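A minimal smoke test could look roughly like this (a sketch only: the class names follow the diff, and the exact fit/transform behaviour of the WIP transformers is assumed):

import numpy as np
import pytest

from coffeine.transfer_learning import ReCenter, ReScale


@pytest.mark.parametrize('Transformer', [ReCenter, ReScale])
def test_alignment_transformer_smoke(Transformer):
    # Random SPD matrices split across two domains.
    rng = np.random.RandomState(42)
    n_matrices, n_channels = 40, 4
    A = rng.randn(n_matrices, n_channels, n_channels)
    X = A @ A.transpose(0, 2, 1) + 1e-6 * np.eye(n_channels)
    y = rng.randint(0, 2, n_matrices)
    domains = np.array(['source_domain'] * 20 + ['target_domain'] * 20)

    transformer = Transformer(domains=domains)
    X_aligned = transformer.fit(X, y).transform(X)

    # Aligned matrices should keep their shape and stay SPD.
    assert X_aligned.shape == X.shape
    assert np.all(np.linalg.eigvalsh(X_aligned) > 0)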
coffeine/transfer_learning.py (outdated)
Parameters
----------
metric : str, default='riemann'
The metric to compute the mean.
Indent issue
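For reference, the numpydoc convention indents the description one level under the parameter line:

Parameters
----------
metric : str, default='riemann'
    The metric to compute the mean.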
coffeine/transfer_learning.py (outdated)
def __init__(self, domains, metric='riemann'):
    self.domains = domains
    self.metric = metric
    self.re_center = TLCenter('target_domain', metric=self.metric)
This should not be put in the init
Init should just set the init parameters as attributes
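A sketch of the suggested pattern, keeping the names from the diff (the scikit-learn convention is that __init__ only stores the constructor arguments and does no work):

def __init__(self, domains, metric='riemann'):
    # Only store the parameters; the TLCenter instance is built in fit.
    self.domains = domains
    self.metric = metric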
coffeine/transfer_learning.py (outdated)
def fit(self, X, y):
    X = _check_data(X)
    _, y_enc = encode_domains(X, y, self.domains)
    self.means = self.re_center.fit(X, y_enc).recenter_
re_center_ as it's set and modified in the fit
And means_
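Something along these lines, again only a sketch based on the diff (fitted attributes get a trailing underscore and the TLCenter instance is created inside fit):

def fit(self, X, y):
    X = _check_data(X)
    _, y_enc = encode_domains(X, y, self.domains)
    # Created and modified during fit, hence the trailing underscores.
    self.re_center_ = TLCenter('target_domain', metric=self.metric)
    self.means_ = self.re_center_.fit(X, y_enc).recenter_
    return self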
coffeine/transfer_learning.py (outdated)
self.metric = metric
self.re_scale = TLStretch('target_domain',
                          center_data=True,
                          metric=self.metric)
Idem
coffeine/transfer_learning.py (outdated)
y: ndarray, shape (n_matrices,)
    Labels for each matrix.
domains: ndarray, shape (n_matrices,)
    Domains for each matrix.
Domains is not a param
coffeine/pipelines.py (outdated)
else:
    steps = [
        projection(**projection_params_)
    ] + alignment_steps + [
Can we be sure that this is not buggy? We need to make sure that for every column (‘name’) independent instances are instantiated. Otherwise you risk some stateful behavior.
coffeine/pipelines.py (outdated)
if 're-center' in alignment:
    alignment_steps.append(ReCenter(domains=domains))
if 're-scale' in alignment:
    alignment_steps.append(ReScale(domains=domains))
See comment above. We would use the same ReCenter/ReScale instances for every column of the data frame based on this code here, right?
I don't understand: isn't it the same thing as what is done a few lines above for the projection and vectorization steps?
Also, in practice I checked that each frequency band is aligned independently.
It’s not the same: there we instantiate the classes independently for every column. We can of course check whether this is practically unnecessary by now because of internal cloning etc. But if you want to be consistent with the tested / safe code, you can just adopt the same pattern as for the other projection/vectorization steps. You currently instantiate the classes globally and plug the same instances in column by column.
Ok I see, let's do it as projection/vectorization.
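A sketch of what doing it "as projection/vectorization" could look like: the alignment steps are passed around as (class, params) pairs and only instantiated inside the per-column loop, so every frequency band gets its own estimator instances. The helper names and signatures here are illustrative, not the actual coffeine code:

from sklearn.pipeline import make_pipeline


def _make_alignment_steps(alignment, domains):
    # Return (class, params) pairs; instances are built later, per column.
    steps = []
    if 're-center' in alignment:
        steps.append((ReCenter, dict(domains=domains)))
    if 're-scale' in alignment:
        steps.append((ReScale, dict(domains=domains)))
    return steps


def _get_projector_vectorizer(projection, vectorization, alignment_steps, names):
    # One independent pipeline per column ('name'): every estimator,
    # including the alignment steps, is instantiated fresh here.
    transformers = []
    for name in names:
        steps = [projection[0](**projection[1])]
        steps += [klass(**params) for klass, params in alignment_steps]
        steps += [vectorization[0](**vectorization[1])]
        transformers.append((make_pipeline(*steps), name))
    return transformers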
Some missing milestones:
The failing test is related to this issue in pyRiemann: pyRiemann/pyRiemann#257
@apmellot what's your philosophy here? Fixing it in pyRiemann and waiting for that? Implementing a patch locally and then using pyRiemann once it is fixed there?
The PR is open in pyRiemann; we can wait for it to be merged.
We will need a patch to support earlier versions of pyRiemann, otherwise we depend on the master branch. We can add a check that looks at the installed pyRiemann version and, depending on it, supplies our patched function or uses the upstream one.
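The version check could look roughly like this; the threshold version and the patched class are placeholders, since the pyRiemann release containing the fix is not known yet:

from importlib.metadata import version
from packaging.version import parse


def _get_tl_stretch():
    # Use the upstream implementation once the pyRiemann fix is released;
    # the threshold version below is a placeholder.
    if parse(version('pyriemann')) >= parse('0.5'):
        from pyriemann.transfer import TLStretch
        return TLStretch
    # Hypothetical locally patched copy carrying the fix from pyRiemann#257.
    from coffeine._tl_stretch_patch import PatchedTLStretch
    return PatchedTLStretch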
For now, I only added the re-centering and the re-scaling step between projection and vectorization. I tested it with TUAB and LEMON and the results are as expected.
Next, I want to add the rotation correction step, still using pyRiemann, create an example notebook, and write nice docstrings.
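For illustration, usage could then look roughly like this; the alignment and domains argument names are taken from the diff and may still change, while the rest mirrors the existing make_filter_bank_transformer API:

from coffeine import make_filter_bank_transformer

# Covariances are projected, re-centered and re-scaled per domain (between
# projection and vectorization), then vectorized as usual.
filter_bank_transformer = make_filter_bank_transformer(
    names=['alpha', 'beta'],
    method='riemann',
    alignment=['re-center', 're-scale'],
    domains=domains,  # one domain label per observation
)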