
[CI] Use Spack + OCI buildcache when widely available for system MPI #744
Closed
giordano opened this issue Jun 29, 2023 · 3 comments · Fixed by #788

Comments

@giordano (Member) commented Jun 29, 2023

Ref: https://github.com/haampie/spack-oci-buildcache-example

For example:

mose@Moses-MBP ~ % docker run --rm ghcr.io/haampie/spack-oci-buildcache-example:mvapich-3.0b-hoybelw6cwy4n3zdw2hgc3jziwaaisu4 mpirun --version
WARNING: The requested image's platform (linux/amd64) does not match the detected host platform (linux/arm64/v8) and no specific platform was requested
HYDRA build details:
    Version:                                 3.4.3
    Release Date:                            General Availability Release
    CC:                              /home/runner/work/spack-oci-buildcache-example/spack-oci-buildcache-example/spack/lib/spack/env/gcc/gcc
    Configure options:                       '--disable-option-checking' '--prefix=/home/runner/work/spack-oci-buildcache-example/spack-oci-buildcache-example/spack/opt/spack/linux-ubuntu22.04-x86_64_v2/gcc-12.1.0/mvapich-3.0b-hoybelw6cwy4n3zdw2hgc3jziwaaisu4' '--enable-shared' '--enable-romio' '--disable-silent-rules' '--disable-new-dtags' '--enable-fortran=all' '--enable-threads=multiple' '--with-ch3-rank-bits=32' '--enable-wrapper-rpath=yes' '--disable-alloca' '--with-pmi=simple' '--enable-fast=all' '--disable-cuda' '--enable-registration-cache' '--with-device=ch4:ofi' 'CC=/home/runner/work/spack-oci-buildcache-example/spack-oci-buildcache-example/spack/lib/spack/env/gcc/gcc' 'CXX=/home/runner/work/spack-oci-buildcache-example/spack-oci-buildcache-example/spack/lib/spack/env/gcc/g++' 'FC=/home/runner/work/spack-oci-buildcache-example/spack-oci-buildcache-example/spack/lib/spack/env/gcc/gfortran' 'F77=/home/runner/work/spack-oci-buildcache-example/spack-oci-buildcache-example/spack/lib/spack/env/gcc/gfortran' '--cache-file=/dev/null' '--srcdir=.' 'CFLAGS= -DNDEBUG -DNVALGRIND -O2' 'LDFLAGS=' 'LIBS=-lm ' 'CPPFLAGS= -DNETMOD_INLINE=__netmod_inline_ofi__ -I/tmp/runner/spack-stage/spack-stage-mvapich-3.0b-hoybelw6cwy4n3zdw2hgc3jziwaaisu4/spack-src/src/mpl/include -I/tmp/runner/spack-stage/spack-stage-mvapich-3.0b-hoybelw6cwy4n3zdw2hgc3jziwaaisu4/spack-src/src/mpl/include -I/tmp/runner/spack-stage/spack-stage-mvapich-3.0b-hoybelw6cwy4n3zdw2hgc3jziwaaisu4/spack-src/modules/yaksa/src/frontend/include -I/tmp/runner/spack-stage/spack-stage-mvapich-3.0b-hoybelw6cwy4n3zdw2hgc3jziwaaisu4/spack-src/modules/yaksa/src/frontend/include -I/tmp/runner/spack-stage/spack-stage-mvapich-3.0b-hoybelw6cwy4n3zdw2hgc3jziwaaisu4/spack-src/modules/json-c -I/tmp/runner/spack-stage/spack-stage-mvapich-3.0b-hoybelw6cwy4n3zdw2hgc3jziwaaisu4/spack-src/modules/json-c -D_REENTRANT -I/tmp/runner/spack-stage/spack-stage-mvapich-3.0b-hoybelw6cwy4n3zdw2hgc3jziwaaisu4/spack-src/src/mpi/romio/include' 'MPLLIBNAME=mpl'
    Process Manager:                         pmi
    Launchers available:                     ssh rsh fork slurm ll lsf sge manual persist
    Topology libraries available:            hwloc
    Resource management kernels available:   user slurm ll lsf sge pbs cobalt
    Demux engines available:                 poll select

This is still very much a work in progress (spack/spack#38358), but it should save us a lot of time when installing the various system MPI libraries.
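A minimal sketch of how a CI step might consume such a buildcache once that support lands. The oci:// mirror path and the mvapich spec below are assumptions for illustration, and the exact flags may still change while spack/spack#38358 is in flight:

# Hypothetical: register the OCI buildcache as a Spack mirror, then install a
# prebuilt MPI from it instead of compiling from source.
spack mirror add mpi-buildcache oci://ghcr.io/juliaparallel/github-actions-buildcache
spack install --no-check-signature --cache-only mvapich
spack load mvapich
mpirun --version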

CC: @haampie

@giordano (Member, Author) commented Nov 3, 2023

Can one of the admins of the organisation make packages public, or at least those in https://github.com/JuliaParallel/github-actions-buildcache (settings)? CC: @simonbyrne @vchuravy

@simonbyrne (Member) commented

Did someone do it? Or do I still need to do something?

@giordano (Member, Author) commented Nov 3, 2023

Valentin did it, now testing in #788
