Why FASTQ quality value above qmax is treated as Fatal error? #522
In addition, I tried to use …
Therefore, is it the best practice to use …?
Reads with quality values encoded on more than 41 levels are relatively recent. Observing values outside the 0-41 range used to be a sign that something had gone wrong. Nowadays, maybe 0-93 should be the new default range for quality values?
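For context, the 0-41 and 0-93 ranges come from the ASCII encoding of Phred scores in FASTQ files. A minimal sketch (not vsearch code), assuming the standard Sanger/Illumina 1.8+ offset of +33:

```python
# Sketch: mapping between Phred quality scores and FASTQ quality
# characters, assuming the standard +33 ASCII offset.

def phred_to_char(q, offset=33):
    """Encode a Phred quality score as its FASTQ character."""
    return chr(q + offset)

def char_to_phred(c, offset=33):
    """Decode a FASTQ quality character back to a Phred score."""
    return ord(c) - offset

# Q41 ('J') is the classic Illumina ceiling; Q93 ('~') is the highest
# score representable with a printable ASCII character.
print(phred_to_char(41))   # 'J'
print(phred_to_char(93))   # '~'
print(char_to_phred('I'))  # 40
```

Q93 is the hard upper bound because `chr(93 + 33)` is `~`, the last printable ASCII character.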
Yes, finding the maximum accepted quality value can be done with:

```sh
QMAX=$(printf "@s1\nA\n+\nI\n" | \
    vsearch --fastq_chars - 2>&1 | \
    grep -o -P "fastq_qmax *\K[[:digit:]]+")
```
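As an alternative to parsing the `--fastq_chars` output, the maximum quality value in a file can be computed directly. A sketch, assuming a standard 4-line-per-record FASTQ with the +33 offset (the function name and input are illustrative):

```python
# Sketch: find the highest Phred quality value in a FASTQ file,
# assuming 4 lines per record and the standard +33 ASCII offset.
import itertools

def max_quality(lines, offset=33):
    qmax = 0
    # Quality strings are every 4th line, starting at index 3.
    for qual in itertools.islice(lines, 3, None, 4):
        qmax = max(qmax, max(ord(c) - offset for c in qual.strip()))
    return qmax

fastq = ["@s1", "ACGT", "+", "IIJF"]
print(max_quality(fastq))  # 41, since 'J' encodes Q41
```

The result could then be passed to `--fastq_qmax` so the real run does not abort.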
@torognes what do you think of that?
I agree that the world has moved forward and that some FASTQ files now contain q values above 41. I also agree that the intention of this option was to help users detect old FASTQ files (phred 64), which is hardly a problem anymore. Setting the default qmax to 93 would essentially remove its effect, as 93 is the highest possible quality value obtainable with any printable character.

I am a bit sceptical about changing default values and behaviour like this without a major version change, since it could potentially change results. But the change would only allow some files to be analysed that would previously fail. Compatibility with usearch is another issue; I haven't checked what it does.

Another possibility is to issue a warning instead of a fatal error: perhaps a warning at the first encounter, and then another one after reading the whole file, reporting the maximum q value.
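The warn-twice idea above could look like the following sketch (an illustration of the proposal, not vsearch code; names are hypothetical):

```python
# Sketch of the proposed behaviour: warn on the first out-of-range
# quality value, keep reading, and report the file's maximum at the end.
import sys

def scan_quals(all_quals, qmax=41):
    warned = False
    observed_max = 0
    for q in all_quals:
        if q > qmax and not warned:
            print(f"Warning: quality value {q} exceeds qmax={qmax}",
                  file=sys.stderr)
            warned = True
        observed_max = max(observed_max, q)
    if observed_max > qmax:
        print(f"Warning: maximum quality value in file was {observed_max}",
              file=sys.stderr)
    return observed_max

print(scan_quals([30, 45, 50, 42]))  # 50
```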
@torognes I agree with issuing a warning instead of a fatal error. A fatal error is too drastic here.
Hello, I then changed --fastq_qmax to 90. What actually happens to the reads when I filter for a quality score of Q10 after previously filtering them for a quality score of Q20?
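For quality-based truncation, a second pass with a lower threshold is a no-op, since every base surviving a Q20 cut already has quality >= 20. A sketch in the spirit of `--fastq_truncqual` (the function is illustrative, not vsearch internals):

```python
# Illustration: truncate a read at the first position whose quality
# falls below the threshold, as quality-based 3' trimming does.

def truncqual(quals, threshold):
    """Keep bases up to (not including) the first one below threshold."""
    for i, q in enumerate(quals):
        if q < threshold:
            return quals[:i]
    return quals

read = [38, 35, 30, 22, 15, 8]
q20 = truncqual(read, 20)        # [38, 35, 30, 22]
q10_after = truncqual(q20, 10)
print(q10_after == q20)          # True: the Q10 pass changes nothing
```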
When I used --fastx_filter to trim FASTQ files (seemingly a very easy task), I encountered a fatal error. So I added --fastq_qmax 75; however, another fatal error occurred. An example is https://www.ncbi.nlm.nih.gov/sra/SRR19317902, whose reads look like …

Above all, what I wondered is: why is a FASTQ quality value above qmax treated as a fatal error? I had thought that any quality value larger than 41 would be capped at 41.
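The behaviour the reporter expected can be sketched as follows (this is an assumption about the desired behaviour, not what vsearch actually does; vsearch aborts instead of clamping):

```python
# Sketch of the expected behaviour: cap quality scores at qmax
# rather than rejecting the read with a fatal error.

def clamp_quals(quals, qmax=41):
    """Return the quality list with every score capped at qmax."""
    return [min(q, qmax) for q in quals]

print(clamp_quals([37, 40, 55, 90]))  # [37, 40, 41, 41]
```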