Exiting because a job execution failed. #3
Hello Pierre,
Apart from this issue, I don't recommend using Novel-X on this dataset. The specification says that its coverage is around 20X (three times lower than the usual coverage), and Novel-X is not expected to work at that depth. For the NA24385 dataset I used data from https://s3-us-west-2.amazonaws.com/human-pangenomics/index.html?prefix=HG002/hpp_HG002_NA24385_son_v1/10XG/giab_chromium_data/10XGenomics_ChromiumGenome_LongRanger2.2_Supernova2.0.1_04122018_high_coverage_bams/ (https://github.com/human-pangenomics/HG002_Data_Freeze_v1.0/blob/master/README.md).
Hi, it's been a few years since this thread. Does Novel-X still not work on lower-coverage datasets? What's the minimum coverage you'd suggest?
Maggs
Unfortunately not...
I would suggest starting from at least 30X, but results will saturate at around 50X.
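For context, sequencing coverage can be roughly estimated as total sequenced bases divided by genome size. A minimal sketch, using hypothetical read counts (the numbers below are illustrative, not from this thread):

```python
# Back-of-the-envelope sequencing coverage estimate.
# All numeric values here are illustrative assumptions, not from the dataset discussed.

def estimate_coverage(read_count: int, read_length: int, genome_size: int) -> float:
    """Approximate depth of coverage: total sequenced bases / genome size."""
    return read_count * read_length / genome_size

# Hypothetical example: 400 million 150 bp reads against a ~3 Gb human genome.
depth = estimate_coverage(read_count=400_000_000,
                          read_length=150,
                          genome_size=3_000_000_000)
print(f"~{depth:.0f}X")  # 20X -- below the 30X minimum suggested above
```

In practice one would compute real per-base depth from the BAM (e.g. with samtools), but this arithmetic is enough to sanity-check a dataset against the 30X guideline before launching a run.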
Awesome, thanks so much!
Maggs X
they/them
Hello,
Sorry to bother you again.
I'm attempting to run Novel-X on the following 10x Genomics BAM file: ftp://ftp-trace.ncbi.nlm.nih.gov/giab/ftp/data/AshkenazimTrio/HG002_NA24385_son/10XGenomics/, using the associated reference genome.
However, after running for a while, Novel-X stops and reports that a job execution failed. Please find the log file attached here:
log.txt
I can see two warning messages:
One indicates that I seem to be using an obsolete version of Blast+ and that I should try to use a newer version. However, I don't have root access on the cluster Novel-X is running on, and cannot update it myself.
The other indicates that @RG tags are missing from the BAM file. The complete warning message is here:
Do you have any idea what could be causing this issue? I'm not sure which command the "no @RG" warning is associated with, so I could not attempt to tweak it myself by adding --gemcode, --lr20, or --cr11 to see if that would work.
Thanks in advance for your help.
Best,
Pierre
Edit: I forgot to mention that I'm running on a node with 150 GB of RAM, so memory should not be a problem.
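For reference, the "no @RG" warning means the BAM header lacks read-group records. A minimal sketch of that check, run on a hypothetical SAM header string (in practice one would inspect the real header, e.g. via `samtools view -H`):

```python
# Minimal sketch: detect whether a SAM-format header contains @RG (read group) records.
# The header strings below are hypothetical examples, not from the actual GIAB BAM.

def has_read_groups(sam_header: str) -> bool:
    """Return True if any @RG line is present in the header text."""
    return any(line.startswith("@RG") for line in sam_header.splitlines())

header_without_rg = "@HD\tVN:1.6\tSO:coordinate\n@SQ\tSN:chr1\tLN:248956422\n"
header_with_rg = header_without_rg + "@RG\tID:sample1\tSM:HG002\tPL:ILLUMINA\n"

print(has_read_groups(header_without_rg))  # False -> would trigger a "no @RG" warning
print(has_read_groups(header_with_rg))     # True
```

If the header really has no @RG lines, one option is to add them when (re)aligning, or to annotate the existing BAM with a tool such as Picard AddOrReplaceReadGroups; which fix applies depends on how the BAM was produced.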