BUILD FAILURE #251
Hi @shibuvp! Sorry for the confusion, the docs are a bit out of date. Avocado needs to be built for Spark 2, due to a bug in Spark SQL in Spark 1.6.x.
thanks @fnothaft
No problem! Please leave this issue open; it'll remind me to update the docs.
ok, sure!
Sure, can do. What genome build are you on?
i'm using hg19.fa with a paired FASTQ file on ADAM
Does your reference build have "chr" prefixes on the chromosome names?
no, but when i was visualizing the genome with a sample ADAM file and a test VCF file in the Mango genome browser, i could see the "chr" prefixes, like chr1, chr2, chr3, etc., up to chr22
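Mismatched chromosome naming (Ensembl-style "1" vs UCSC-style "chr1") between the reference, the reads, and browser tracks is a common source of confusion. A quick way to check is to list the FASTA sequence headers; the snippet below builds a throwaway example file so it runs standalone (substitute your real hg19.fa):

```shell
# Build a tiny example FASTA so this snippet is self-contained;
# point grep at your actual hg19.fa instead.
cat > example.fa <<'EOF'
>1 dna:chromosome
ACGTACGT
>chr1
ACGTACGT
EOF
# Sequence names starting with "chr" are UCSC-style; bare numbers are
# Ensembl/GRC-style. Mixing the two styles across files makes lookups miss.
grep '^>' example.fa
```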
OK, interesting. BTW, what is the approximate coverage of your file?
you mean which file?
The aligned reads or FASTQ.
I built avocado, the build failed, and I get this error: `- score snp in a read with no evidence of the snp`
sry sir, i really don't know what the approximate coverage is; i'm a newbie in genomics
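For context, approximate coverage is just the total number of sequenced bases divided by the genome size. A back-of-the-envelope sketch (illustrative numbers, not figures from this thread):

```python
# Rough coverage estimate (illustrative only):
# coverage ≈ (number of reads × read length) / genome size.

def approx_coverage(num_reads: int, read_length: int, genome_size: int) -> float:
    """Mean depth of coverage, assuming reads are spread evenly over the genome."""
    return num_reads * read_length / genome_size

# e.g. 400 million 100 bp reads against the ~3.1 Gbp human genome:
print(round(approx_coverage(400_000_000, 100, 3_100_000_000), 1))  # 12.9
```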
That error is expected on Spark 1.6.2. No worries about the coverage. I'll pull together some default parameters for you in a sec.
okay sir
https://gist.github.com/fnothaft/627f4e295f9400cff453b6c3f671fd7d should be a reasonable set of default parameters. Let me know if you run into any problems!
okay, thanks sir!
can i use an ADAM file as the input reads?
Yup, ADAM and SAM/BAM/CRAM are all fine.
ok, i will try
thanks sir, the avocado build succeeded
You're welcome, glad to hear it!
i'm getting an error while running the avocado command; is it a problem with the scala version?
That's an odd error; I don't think I've seen it before. Did you compile for Scala 2.10 or 2.11? The Spark 2.x pre-built distributions are packaged for Scala 2.11, so I'd generally recommend building for Scala 2.11.
oh.. i have scala 2.10 on my cluster
@fnothaft i changed the scala version to 2.11.8 for spark 2.1.0, but the error is still the same!
the issue i found on my cluster was the scala version: it was 2.11.8, whereas the pom.xml in avocado specifies scala 2.10.4. so i changed the scala version in pom.xml to 2.11.8 and rebuilt ([root@master avocado]# mvn package), but it shows an error and the build fails. can you please check it @fnothaft ?
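One detail worth checking (an observation based on the `avocado-core_2.10` module name in the Maven log, not maintainer advice from this thread): in a Scala cross-build the Scala version appears in two places, the `scala.version` property and the `_2.10`/`_2.11` suffix on artifact IDs, and both must match the Scala version your Spark distribution was built for. A self-contained sketch of what to look for:

```shell
# Self-contained example fragment; inspect avocado's real pom.xml files instead.
cat > pom-snippet.xml <<'EOF'
<properties>
  <scala.version>2.10.4</scala.version>
</properties>
<artifactId>avocado-core_2.10</artifactId>
EOF
# The scala.version property and the _2.10/_2.11 artifact suffix must agree
# with each other and with the Scala version of your Spark build.
grep -E 'scala\.version|artifactId' pom-snippet.xml
```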
i'm getting this error while building avocado:
```
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] avocado: A Variant Caller, Distributed ............. SUCCESS [  2.877 s]
[INFO] avocado-core: Core variant calling algorithms ...... FAILURE [04:43 min]
[INFO] avocado-cli: Command line interface for a distributed variant caller SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 04:46 min
[INFO] Finished at: 2017-07-06T11:12:18+05:30
[INFO] Final Memory: 29M/502M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.scalatest:scalatest-maven-plugin:1.0:test (test) on project avocado-core_2.10: There are test failures -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn -rf :avocado-core_2.10
```
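A common workaround when only the test phase fails (an assumption on my part, not advice given in this thread): package with tests skipped, then rerun the tests once the Scala and Spark versions agree. `-DskipTests` is a standard Maven flag. The snippet echoes the command rather than running it, since executing it requires an avocado checkout:

```shell
# Echoed rather than executed so the sketch stands alone; run the command
# inside the avocado checkout. -DskipTests is a standard Maven flag.
SKIP_TESTS_CMD="mvn package -DskipTests"
echo "$SKIP_TESTS_CMD"
```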