Add highlevel.crop_edf function #196
Conversation
Sorry for the late reply, much to do right now. This might be due to rounding errors introduced by converting back and forth between digital and analog signal space. EDF saves files in digital space (i.e. amplifier integers), and then converts to floats (i.e. voltage) as output, depending on the digital range. Try loading and saving the data with

I'm flying to a conference in the US soon, so I will only have time to look at it in May. If you want, you can also wait for my further input until then, but knowing that you're a good dev (we'll be citing YASA in our upcoming publication! 😊), I assume your code will be good anyway :)
Using
Hi @skjerns! Haha, that's kind of you, but I would not trust myself too much, as I've never dived into the
Yep, works perfectly, thanks! I've added unit tests and everything is running smoothly on my machine ✅
pyedflib/tests/test_highlevel.py
Outdated
```python
highlevel.crop_edf(
    edf_file, new_file=outfile, new_start=new_start, new_stop=new_stop)
highlevel.crop_edf(
    edf_file, new_file=outfile, new_start=None, new_stop=new_stop, verbose=False)
highlevel.crop_edf(
    edf_file, new_file=outfile, new_start=new_start, new_stop=None)
```
Can you add tests here that read the files and check if the data is actually the cropped portion we would expect? The startdatetime and stop are already checked within the function.
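The kind of check requested here can be sketched with in-memory arrays standing in for actual EDF reads: the cropped signal should equal the original sliced at the crop boundaries. The sampling rate and durations below are hypothetical, and `crop_edf` itself is replaced by a plain slice for illustration.

```python
import numpy as np

fs = 256  # Hz, hypothetical sampling rate
original = np.arange(60 * fs, dtype=float)  # 60 s of fake samples
start_sec, stop_sec = 10, 20

# What a correct crop should produce for this channel
cropped = original[start_sec * fs: stop_sec * fs]

# The assertions a test could make after reading the cropped file back:
assert len(cropped) == (stop_sec - start_sec) * fs
assert np.array_equal(cropped, original[start_sec * fs: stop_sec * fs])
```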
Done
My apologies that this took so long. The code looks good, thank you a lot for this addition! One more request: it would be great if cropping could be done with relative times as well. My suggestion would be to have three parameters. What are your thoughts on this?
Thanks for the review @skjerns. Sounds good about adding the possibility to crop based on samples or seconds. For simplicity, what about:

```python
highlevel.crop_edf(
    "data/test_generator.edf",
    start=new_start, stop=new_stop,
    start_format="datetime", stop_format="datetime",
    verbose=True
)
```

where
Yes, that sounds like a good solution!
Ready for re-review @skjerns! Of note, I did not add support for "samples", since I think it could lead to errors when the signals have different sampling frequencies. The only supported options are "datetime" and "seconds". I have also updated the unit tests, which now verify that the signal values and headers are the same.
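The ambiguity of a "samples" format can be shown in a short sketch: channels with different sampling rates map the same instant in time to different sample indices, so a single crop index cannot apply to all of them. The channel names and rates below are hypothetical, for illustration only.

```python
# Hypothetical per-channel sampling rates in Hz
sample_rates = {"EEG": 256, "EOG": 128, "Resp": 32}

crop_start_sec, crop_stop_sec = 10.0, 20.0  # crop window in seconds

# The same 10-20 s window yields different sample indices per channel,
# which is why cropping by seconds (or datetime) is unambiguous but
# cropping by a single sample index is not.
indices = {
    ch: (int(crop_start_sec * fs), int(crop_stop_sec * fs))
    for ch, fs in sample_rates.items()
}
print(indices)
```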
Ah, yes, that makes total sense. Thanks for thinking of this! I'll review it soon.
Co-authored-by: Simon Kern <[email protected]>
All suggestions merged. Thanks!

Thanks a lot for the contribution @raphaelvallat!
As discussed in #195, this adds a new function to crop the EDF file.
I'm still working on the unit tests. The following works like a charm:
However, when I set both `new_start` and `new_stop` to None, I should get True with `compare_edf(original, cropped)`, but instead got the following error:

Do you have any idea why that might be? I'm not changing any of the signal headers 🤔
PS: I would recommend using black for code formatting. I usually set the max line length to 100 instead of the default because I find it more readable. If that's something you'd be interested in, let me know and I can submit a second PR after this one.
Thanks,
Raphael