CalculiX-SCOTCH-PARSEC Docker image

Following the suggestions contained here, I'm trying to create a Docker image of CalculiX that uses SCOTCH as the partitioning library and PaRSEC as the scheduling library:

FROM nvidia/cuda:12.3.1-devel-ubuntu22.04

ENV DEBIAN_FRONTEND=noninteractive
# Set CUDA paths correctly
ENV CUDA_PATH=/usr/local/cuda
ENV CUDA_HOME=/usr/local/cuda
ENV PATH=${CUDA_PATH}/bin:${PATH}
ENV LD_LIBRARY_PATH=${CUDA_PATH}/lib64:${LD_LIBRARY_PATH}

# Create symlinks for CUDA includes
RUN ln -s /usr/local/cuda/include/* /usr/include/

RUN apt-get update && apt-get install -y \
    build-essential cmake git python3 python3-pip python3-dev \
    hwloc libhwloc-dev libevent-dev pkg-config automake autoconf \
    libtool flex bison libssl-dev libopenmpi-dev openmpi-bin \
    m4 wget unzip zlib1g-dev \
    libblas-dev liblapacke-dev liblapack-dev libopenblas-dev \
    && rm -rf /var/lib/apt/lists/*

RUN pip3 install --no-cache-dir numpy cython mpi4py

# Install Scotch
WORKDIR /opt
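# Note: this pulls Scotch's moving master branch; pinning a release tag would make the image reproducible.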
RUN wget --no-check-certificate https://gitlab.inria.fr/scotch/scotch/-/archive/master/scotch-master.tar.gz && \
    tar xf scotch-master.tar.gz && \
    cd scotch-master/src && \
    cp Make.inc/Makefile.inc.x86-64_pc_linux2 Makefile.inc && \
    # For 64-bit integers (skip this sed to keep the default 32-bit build):
    sed -i 's/-DSCOTCH_PTHREAD/-DSCOTCH_PTHREAD -DINTSIZE64/g' Makefile.inc && \
    make -j$(nproc) && \
    make install && \
    cd ../.. && \
    rm -rf scotch-master*

# Install PaRSEC
WORKDIR /opt
# Alternative: build from the master branch instead of the pinned release:
# RUN wget https://github.com/ICLDisco/parsec/archive/refs/heads/master.zip && \
#     unzip master.zip && \
#     mv parsec-master parsec
RUN wget https://github.com/ICLDisco/parsec/archive/refs/tags/parsec-3.0.2209.zip && \
    unzip parsec-3.0.2209.zip && \
    mv parsec-parsec-3.0.2209 parsec

WORKDIR /opt/parsec
RUN cmake -S . -B build \
    -DPARSEC_GPU_WITH_CUDA=ON \
    -DPARSEC_DIST_WITH_MPI=ON \
    -DPARSEC_DIST_COLLECTIVES=ON \
    -DPARSEC_DEBUG=OFF \
    -DPARSEC_PROF_TRACE=OFF \
    -DPARSEC_PROF_DRY_RUN=OFF \
    -DPARSEC_PROF_GRAPHER=OFF \
    # For 64-bit integers (use -DPARSEC_INTEGER_SIZE=32 for 32-bit instead):
    -DPARSEC_INTEGER_SIZE=64 \
    -DPARSEC_WITH_DEVEL_HEADERS=ON \
    -DPARSEC_ARENA_ALLOCATOR=ON \
    -DCMAKE_CUDA_COMPILER=/usr/local/cuda/bin/nvcc \
    && cmake --build build -j && \
    cmake --install build

# Install PaStiX
WORKDIR /opt
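# Note: this also tracks PaStiX's moving master branch; a pinned release might avoid API skew with PaRSEC.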
RUN git clone https://gitlab.inria.fr/solverstack/pastix.git && \
    cd pastix && \
    cmake -S . -B build \
    -DPASTIX_WITH_MPI=ON \
    -DPASTIX_WITH_CUDA=ON \
    -DPASTIX_WITH_PARSEC=ON \
    -DPASTIX_WITH_SCOTCH=ON \
    # For 64-bit integers (use -DPASTIX_INT64=OFF for 32-bit instead):
    -DPASTIX_INT64=ON \
    -DPASTIX_ORDERING_SCOTCH=ON \
    -DPASTIX_MAX_BLOCKSIZE=2048 \
    -DPASTIX_SPLIT_SIZE=1024 \
    -DPASTIX_TASKING_THRESHOLD=128 \
    -DCMAKE_BUILD_TYPE=Release \
    -DCMAKE_CUDA_COMPILER=/usr/local/cuda/bin/nvcc \
    -DCUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda \
    && cmake --build build -j$(nproc) && \
    cmake --install build

ENV PATH="/usr/local/bin:${PATH}"
ENV LD_LIBRARY_PATH="/usr/local/lib:${LD_LIBRARY_PATH}"

WORKDIR /workspace
CMD ["/bin/bash"]

Unfortunately, the PaStiX build fails with errors pointing at an arenas member that apparently no longer exists:

105.8 /opt/pastix/build/sopalin/parsec/parsec_sgetrf.c:99:51: error: 'parsec_sgetrf_sp1dplus_taskpool_t' {aka 'struct parsec_sgetrf_sp1dplus_taskpool_s'} has no member named 'arenas'
105.8    99 |     parsec_arena_construct( parsec_sgetrf_sp1dplus->arenas[PARSEC_sgetrf_sp1dplus_DEFAULT_ARENA],
105.8       |                                                   ^~
105.8 /opt/pastix/build/sopalin/parsec/parsec_sgetrf.c:99:5: error: too many arguments to function 'parsec_arena_construct'
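
If I read the two code bases correctly, the generated file parsec_sgetrf.c is written against an older PaRSEC interface, in which each taskpool carried an arenas array and parsec_arena_construct() took the datatype as a fourth argument. PaRSEC 3.0.2209 seems to have dropped that array in favour of arenas_datatypes entries of type parsec_arena_datatype_t. Below is a sketch of the call shape I would expect the newer API to want (my reading of parsec/arena.h, not code from either project; construct_float_adt is a hypothetical helper name):

#include <parsec.h>
#include <parsec/arena.h>
#include <parsec/datatype.h>

/* Hypothetical helper, only to illustrate the PaRSEC 3.x call shape as I
 * understand it: the arena and its datatype are bundled in a single
 * parsec_arena_datatype_t, and there is no taskpool->arenas[] array. */
static int construct_float_adt(parsec_arena_datatype_t *adt, size_t elem_size)
{
    return parsec_arena_datatype_construct(adt, elem_size,
                                           PARSEC_ARENA_ALIGNMENT_SSE,
                                           parsec_datatype_float_t);
}

So the PaStiX sources I'm compiling appear to target a pre-3.0 PaRSEC, which makes me wonder whether I should pin an older PaRSEC release or patch PaStiX.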

Thanks in advance for any help.

Currently, CalculiX uses a modified version of PaStiX 6.0.1.