While downloading GSE178610, the GitHub action is failing:
Trying GSE178610 (not a file) as accession...
Skipped 0 accessions. Starting now.
Processing accession 1 of 1: 'GSE178610'
/home/runner/work/_temp/4dfe2662-0b6d-4b4d-84ff-aad23b8dfa34.sh: line 1: 1810 Killed python geo_pipeline_script.py --namespace geo_recent --host *** --db *** --user *** -***
Error: Process completed with exit code 137.
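Exit code 137 is 128 + 9, i.e. the process received SIGKILL, which on a CI runner usually means the kernel OOM killer or a memory limit terminated it. One way to keep a single bad accession from killing the whole run is to launch each accession in its own subprocess and classify the exit status. This is a rough sketch, not geofetch's actual behavior; the `--accession` flag on `geo_pipeline_script.py` is hypothetical:

```python
import subprocess
import sys

def is_oom_kill(returncode: int) -> bool:
    """True when a child was SIGKILLed: a shell reports 137 (128 + 9),
    while Python's subprocess reports signal deaths as negative numbers."""
    return returncode in (137, -9)

def run_accessions(accessions):
    """Run each accession in its own subprocess so an OOM kill only
    loses that one accession instead of the whole run."""
    skipped = []
    for acc in accessions:
        # Hypothetical per-accession invocation of the pipeline script.
        proc = subprocess.run(
            [sys.executable, "geo_pipeline_script.py", "--accession", acc]
        )
        if is_oom_kill(proc.returncode):
            print(f"{acc}: killed (likely out of memory), skipping")
            skipped.append(acc)
    return skipped
```

With this wrapper, a memory-killed accession is logged and skipped rather than aborting the CI job with exit code 137.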
I am not sure why this error is happening. My guess is that geofetch is building too large an annotation dict or list.
Do you have any ideas how to handle this error and, if it occurs, how to skip this accession? @nleroy917 @nsheff
It's being killed by the memory monitor on the HPC. So, it's in some way outside of geofetch's hands.
Two possible responses are:
you could try to put in some interrupt-signal handling, to fail gracefully. But actually it looks like that may already be there.
you could try to streamline the process to not use so much memory, even if there's a big GSM file -- say, by reading it in chunks or something. You could find out what is using all the memory and make it more efficient.
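The second option amounts to streaming the SOFT file line by line instead of materializing one big dict or list. A rough sketch of that idea, assuming the usual SOFT layout of `!key = value` header lines with embedded data tables fenced by `_table_begin`/`_table_end` markers; this is not geofetch's actual parser:

```python
def stream_soft_metadata(path):
    """Yield (key, value) metadata pairs from a SOFT file one line at a
    time, skipping embedded data tables, so the whole file never has to
    sit in memory at once."""
    with open(path, "rt", encoding="utf-8", errors="replace") as fh:
        in_table = False
        for line in fh:
            stripped = line.rstrip()
            # Data tables are fenced by *_table_begin / *_table_end lines.
            if stripped.startswith("!") and stripped.endswith("_table_begin"):
                in_table = True
                continue
            if stripped.startswith("!") and stripped.endswith("_table_end"):
                in_table = False
                continue
            if in_table:
                continue
            # Header lines look like "!Sample_title = ..." or "^SAMPLE = ...".
            if stripped.startswith(("!", "^")) and "=" in stripped:
                key, _, value = stripped.partition("=")
                yield key.strip(), value.strip()
```

Because the function is a generator, peak memory stays proportional to one line rather than to the file size, even for a multi-gigabyte GSM record.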
In the current version of metageo_pephub I use two check functions that interrupt the download of huge GSM files without interrupting the current cycle (run), so the other projects continue downloading. The two checks are:
geofetch checks whether the GSM file is bigger than 1 GB. If it is, geofetch skips the project.
The maximum geofetch processing time is set to 2 minutes. This helps avoid memory errors, especially when the lists in geofetch grow very large.
Additionally, I have added functionality that handles tables in SOFT files, so tables will no longer be a problem for us.