@bbpbuildbot
Created April 13, 2022 10:47
Logfiles for GitLab pipeline https://bbpgitlab.epfl.ch/hpc/coreneuron/-/pipelines/48472 (✅) running on GitHub PR BlueBrain/CoreNeuron#797.
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1649843398:resolve_secrets Resolving secrets
section_end:1649843398:resolve_secrets section_start:1649843398:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor018087411, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 48472, build ref 9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6, job ID 211023
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
A slurm job will be created with name GL_J211023_PROD_P112_CP7_C16
Job parameters: memory=76G, cpus_per_task=8, duration=1:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 379124
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J211023_PROD_P112_CP7_C16 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --ntasks=2 --jobid=379124 --cpus-per-task=8 --mem=76G
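
The custom executor allocates a node once per job with a placeholder batch job, then dispatches each CI step into that allocation with srun --jobid. A minimal sketch of the same pattern using the flags shown above (the --parsable flag, the hostname step and the final scancel are illustrative additions, not taken from this log):

    # Allocate a node with a placeholder job, mirroring the sbatch flags above.
    JOBID=$(sbatch --parsable -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G \
            -C cpu --no-requeue --time=1:00:00 --wrap="sleep infinity")
    # Run each CI step inside the existing allocation instead of submitting a new job.
    srun --mpi=none --jobid="${JOBID}" --ntasks=2 --cpus-per-task=8 --mem=76G hostname
    # Release the node once the job script has finished.
    scancel "${JOBID}"
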
section_end:1649843399:prepare_executor section_start:1649843399:prepare_script Preparing environment
Running on r2i2n21 via bbpv1.epfl.ch...
section_end:1649843402:prepare_script section_start:1649843402:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1649843403:get_sources section_start:1649843403:restore_cache Restoring cache
Checking cache for build:coreneuron:mod2c:intel-8...
Runtime platform  arch=amd64 os=linux pid=30227 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1649843407:restore_cache section_start:1649843407:download_artifacts Downloading artifacts
Downloading artifacts for spack_setup (211016)...
Runtime platform  arch=amd64 os=linux pid=30311 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=211016 responseStatus=200 OK token=EkaigVme
section_end:1649843407:download_artifacts section_start:1649843407:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
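
The heredoc above is collapsed in the log. Judging from the first entries of the spack config blame output below, it most likely writes only the per-job build_stage and source_cache keys; a hypothetical reconstruction:

    # Hypothetical reconstruction of the collapsed heredoc; the actual body is not shown in the log.
    cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT
    config:
      build_stage:
        - ${SPACK_BUILD}
      source_cache: ${PWD}/spack-source-cache
    END_SCRIPT
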
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211023/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211023/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211023/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211023/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211023/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access [email protected] (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access [email protected] (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472/J211023_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = [email protected]:" > "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
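
With this rewrite in place, SSH-style URLs for the internal GitLab instance are fetched over HTTPS using the job token. For example (repository path taken from the project named above; the clone itself is not part of this job):

    # git@bbpgitlab.epfl.ch:hpc/coreneuron.git is rewritten on the fly to
    # https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/hpc/coreneuron.git
    git clone git@bbpgitlab.epfl.ch:hpc/coreneuron.git
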
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install coreneuron%intel +tests~legacy-unit build_type=Debug
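
The spec string is assembled from per-job CI variables that are not themselves printed in the log; a worked example with the values implied by the line above (SPACK_PACKAGE_DEPENDENCIES and SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB are empty for this job and omitted):

    # Values inferred from the "Preparing to install" line; they are assumptions, not log output.
    SPACK_PACKAGE=coreneuron
    SPACK_PACKAGE_COMPILER=intel          # ${VAR:+%} inserts the '%' separator only when a compiler is set
    SPACK_PACKAGE_SPEC="+tests~legacy-unit build_type=Debug"
    SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC}"
    echo "${SPACK_FULL_SPEC}"             # -> coreneuron%intel +tests~legacy-unit build_type=Debug
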
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be hsyhen5wbojr4dsx5rf32wndj3dpm5nb
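
The one-liner above parses the JSON emitted by spack spec --json and takes the hash of the first node, which is the requested root package. An equivalent extraction with jq, shown only as an alternative sketch (jq is not used anywhere in this pipeline):

    # Same root-node hash, pulled out with jq instead of the inline python.
    SPACK_INSTALLED_HASH=$(echo "${JSON_SPEC}" | jq -r '.spec.nodes[0].hash')
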
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
Cache directory: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/379124/ccache
Primary config: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/379124/ccache/ccache.conf
Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
Stats updated: Wed Apr 13 11:50:37 2022
Hits: 0 / 0
Direct: 0 / 0
Preprocessed: 0 / 0
Misses: 0
Direct: 0
Preprocessed: 0
Primary storage:
Hits: 0 / 0
Misses: 0
Cache size (GB): 0.39 / 0.51 (75.50 %)
Files: 6617
$ fi
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: 'hsyhen5wbojr4dsx5rf32wndj3dpm5nb'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- coreneuron%intel~legacy-unit+tests build_type=Debug
Concretized
--------------------------------
- hsyhen5 coreneuron@develop%intel@2021.4.0~caliper~codegenopt~gpu~ipo~ispc~knl~legacy-unit+mpi~nmodl~openmp~profile+report+shared~sympy~sympyopt+tests~unified build_type=Debug arch=linux-rhel7-skylake
- dwdch6b ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] dl7pfht ^[email protected]%[email protected]+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
- z3q5f3x ^[email protected]%[email protected]~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
- s4ueg72 ^[email protected]%[email protected]+lex~nls arch=linux-rhel7-skylake
[^] 3narjkw ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] nfrzrlo ^[email protected]%[email protected]~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ofveh3j ^[email protected]%[email protected]+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] xys6npz ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] e7w5nez ^[email protected]%[email protected]+optimize+pic+shared arch=linux-rhel7-skylake
[^] wk3uenv ^[email protected]%[email protected]~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
- 6ggc5yr ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] f27a7nn ^[email protected]%[email protected]+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] gdhqypa ^[email protected]%[email protected]~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> bison@3.8.2 : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-dwdch6bmdeclr2novthsywtrryotawwz)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/boost-1.78.0-hiorc7 (external boost-1.78.0-dl7pfhtsvyaq3273gr2g3l5vr37eeydc)
==> cmake@3.21.4 : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-z3q5f3xwpuibd3qbgdscqmu3efarbu42)
==> flex@2.6.3 : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-s4ueg72j7l6vkdyvfxj2tweo7v7s3otx)
==> hpe-mpi@2.25.hmpt : has external module in ['hpe-mpi/2.25.hmpt']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/hpe-mpi-2.25.hmpt-4ukyxt (external hpe-mpi-2.25.hmpt-3narjkwjh6i2jr3zl3g5wdjlqi52hkwh)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/pkgconf-1.8.0-xys6np
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/zlib-1.2.11-mswjss (external zlib-1.2.11-e7w5nezpwf572epfhbosqfzboztysout)
==> ninja@1.10.2 : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-6ggc5yre7qddwxdjmn7sfptpdoiy4dtp)
==> python@3.9.7 : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-f27a7nnhppeztxijpispnaedcfo2vjxi)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/spdlog-1.9.2-wk3uen
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/reportinglib-2.5.6-gdhqyp
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/hdf5-1.10.7-ofveh3
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/libsonata-report-1.1-nfrzrl
==> Installing coreneuron-develop-hsyhen5wbojr4dsx5rf32wndj3dpm5nb
==> No binary for coreneuron-develop-hsyhen5wbojr4dsx5rf32wndj3dpm5nb found: installing from source
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472, but it is owned by 0
==> No patches needed for coreneuron
==> coreneuron: Executing phase: 'cmake'
==> coreneuron: Executing phase: 'build'
==> coreneuron: Executing phase: 'install'
==> coreneuron: Successfully installed coreneuron-develop-hsyhen5wbojr4dsx5rf32wndj3dpm5nb
Fetch: 4.60s. Build: 30.19s. Total: 34.79s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/software/install_intel-2021.4.0-skylake/coreneuron-develop-hsyhen
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
Cache directory: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/379124/ccache
Primary config: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/379124/ccache/ccache.conf
Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
Stats updated: Wed Apr 13 11:51:56 2022
Hits: 109 / 116 (93.97 %)
Direct: 91 / 116 (78.45 %)
Preprocessed: 18 / 25 (72.00 %)
Misses: 7
Direct: 25
Preprocessed: 7
Uncacheable: 18
Primary storage:
Hits: 200 / 232 (86.21 %)
Misses: 32
Cache size (GB): 0.38 / 0.51 (74.25 %)
Files: 6631
Uncacheable:
Called for linking: 15
No input file: 3
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading py-virtualenv/16.7.6
Autoloading py-scipy/1.7.1
Autoloading py-numpy/1.19.5
Autoloading py-mpi4py/3.1.2
Autoloading hpe-mpi/2.25.hmpt
Autoloading py-h5py/3.4.0
Autoloading hdf5/1.10.7
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
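
The exit status above is driven by the number of <failure> and <error> elements in the JUnit report written by spack install --log-format=junit. To see which specs actually failed, one option (assuming the usual JUnit layout and that xmllint is available on the node; neither is shown in this log) is:

    # List the names of test cases that carry a <failure> or <error> child.
    xmllint --xpath '//testcase[failure or error]/@name' "${CI_PROJECT_DIR}/install.xml"
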
section_end:1649843518:step_script section_start:1649843518:archive_cache Saving cache for successful job
Creating cache build:coreneuron:mod2c:intel-8...
Runtime platform  arch=amd64 os=linux pid=32605 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Acoreneuron%3Amod2c%3Aintel-8
Created cache
section_end:1649843534:archive_cache section_start:1649843534:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=32750 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=211023 responseStatus=201 Created token=SAThh_5K
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=32790 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=211023 responseStatus=201 Created token=SAThh_5K
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=32835 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=211023 responseStatus=201 Created token=SAThh_5K
section_end:1649843536:upload_artifacts_on_success section_start:1649843536:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1649843537:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1649843397:resolve_secrets Resolving secrets
section_end:1649843397:resolve_secrets section_start:1649843397:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor062257241, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 48472, build ref 9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6, job ID 211020
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
A slurm job will be created with name GL_J211020_PROD_P112_CP6_C11
Job parameters: memory=76G, cpus_per_task=8, duration=1:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 379123
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J211020_PROD_P112_CP6_C11 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --ntasks=2 --jobid=379123 --cpus-per-task=8 --mem=76G
section_end:1649843400:prepare_executor section_start:1649843400:prepare_script Preparing environment
Running on r2i2n20 via bbpv1.epfl.ch...
section_end:1649843403:prepare_script section_start:1649843403:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1649843404:get_sources section_start:1649843404:restore_cache Restoring cache
Checking cache for build:coreneuron:mod2c:nvhpc:acc:unified-8...
Runtime platform  arch=amd64 os=linux pid=231975 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1649843407:restore_cache section_start:1649843407:download_artifacts Downloading artifacts
Downloading artifacts for spack_setup (211016)...
Runtime platform  arch=amd64 os=linux pid=232495 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=211016 responseStatus=200 OK token=EkaigVme
section_end:1649843408:download_artifacts section_start:1649843408:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211020/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211020/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211020/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211020/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211020/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access [email protected] (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access [email protected] (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472/J211020_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = [email protected]:" > "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install coreneuron%nvhpc +gpu+unified+openmp+tests~legacy-unit build_type=RelWithDebInfo
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be knqybtff3rwigwjx34v3sgxt6qlziega
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
Cache directory: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/379123/ccache
Primary config: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/379123/ccache/ccache.conf
Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
Stats updated: Wed Apr 13 11:50:38 2022
Hits: 0 / 0
Direct: 0 / 0
Preprocessed: 0 / 0
Misses: 0
Direct: 0
Preprocessed: 0
Primary storage:
Hits: 0 / 0
Misses: 0
Cache size (GB): 0.26 / 0.51 (49.99 %)
Files: 8156
$ fi
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: 'knqybtff3rwigwjx34v3sgxt6qlziega'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- coreneuron%nvhpc+gpu~legacy-unit+openmp+tests+unified build_type=RelWithDebInfo
Concretized
--------------------------------
- knqybtf coreneuron@develop%nvhpc@22.3~caliper~codegenopt+gpu~ipo~ispc~knl~legacy-unit+mpi~nmodl+openmp~profile+report+shared~sympy~sympyopt+tests+unified build_type=RelWithDebInfo arch=linux-rhel7-skylake
- 6s6wcfe ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] vn2t5vi ^[email protected]%[email protected]+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
- ucwiakr ^[email protected]%[email protected]~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
[^] gi5x2dn ^[email protected]%[email protected]~allow-unsupported-compilers~dev arch=linux-rhel7-skylake
- ajxdymq ^[email protected]%[email protected]+lex~nls arch=linux-rhel7-skylake
[^] pjmdwuu ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] vtyvxbi ^[email protected]%[email protected]~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] tbfoeg7 ^[email protected]%[email protected]+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 6dfyugs ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] h7lotu6 ^[email protected]%[email protected]+optimize+pic+shared arch=linux-rhel7-skylake
[^] snd2rt6 ^[email protected]%[email protected]~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
- cp3ofsp ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] f62u3oh ^[email protected]%[email protected]+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] bdkvweu ^[email protected]%[email protected]~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> bison@3.8.2 : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-6s6wcfeanp2mkdib4c3n3ivkcuosopgm)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/boost-1.78.0-hiorc7 (external boost-1.78.0-vn2t5viljvtb5wyfhgmx57txyeiznxo4)
==> cmake@3.21.4 : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-ucwiakreeghdgbo22kbqqhgnnlwxqtnn)
==> cuda@11.6.1 : has external module in ['cuda/11.6.1']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cuda-11.6.1-ngetva (external cuda-11.6.1-gi5x2dn47i7nv76flbq7aatllxpaqmjp)
==> flex@2.6.3 : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-ajxdymqhnzqgeuhvk5nu5zfymzq35n6i)
==> hpe-mpi@2.25.hmpt : has external module in ['hpe-mpi/2.25.hmpt']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/hpe-mpi-2.25.hmpt-4ukyxt (external hpe-mpi-2.25.hmpt-pjmdwuu36ioiwqyrg6lrj7nrq3waqjj2)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/pkgconf-1.8.0-6dfyug
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/zlib-1.2.11-mswjss (external zlib-1.2.11-h7lotu6jszcvumwucry2y5jnhvxw5x2d)
==> ninja@1.10.2 : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-cp3ofspgkdqaqih5vrhotmdwvkozfswp)
==> python@3.9.7 : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-f62u3ohchswt5q2b63chawohzqrl6wvy)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/spdlog-1.9.2-snd2rt
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/reportinglib-2.5.6-bdkvwe
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/hdf5-1.10.7-tbfoeg
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/libsonata-report-1.1-vtyvxb
==> Installing coreneuron-develop-knqybtff3rwigwjx34v3sgxt6qlziega
==> No binary for coreneuron-develop-knqybtff3rwigwjx34v3sgxt6qlziega found: installing from source
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472, but it is owned by 0
==> No patches needed for coreneuron
==> coreneuron: Executing phase: 'cmake'
==> coreneuron: Executing phase: 'build'
==> coreneuron: Executing phase: 'install'
==> coreneuron: Successfully installed coreneuron-develop-knqybtff3rwigwjx34v3sgxt6qlziega
Fetch: 3.79s. Build: 1m 29.23s. Total: 1m 33.02s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/software/install_nvhpc-22.3-skylake/coreneuron-develop-knqybt
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
Cache directory: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/379123/ccache
Primary config: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/379123/ccache/ccache.conf
Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
Stats updated: Wed Apr 13 11:53:18 2022
Hits: 109 / 119 (91.60 %)
Direct: 91 / 119 (76.47 %)
Preprocessed: 18 / 28 (64.29 %)
Misses: 10
Direct: 28
Preprocessed: 10
Uncacheable: 19
Primary storage:
Hits: 200 / 238 (84.03 %)
Misses: 38
Cache size (GB): 0.25 / 0.51 (48.65 %)
Files: 8176
Uncacheable:
Called for linking: 16
No input file: 3
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading py-virtualenv/16.7.6
Autoloading py-scipy/1.7.1
Autoloading py-numpy/1.19.5
Autoloading py-mpi4py/3.1.2
Autoloading hpe-mpi/2.25.hmpt
Autoloading py-h5py/3.4.0
Autoloading hdf5/1.10.7
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
section_end:1649843604:step_script section_start:1649843604:archive_cache Saving cache for successful job
Creating cache build:coreneuron:mod2c:nvhpc:acc:unified-8...
Runtime platform  arch=amd64 os=linux pid=241845 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Acoreneuron%3Amod2c%3Anvhpc%3Aacc%3Aunified-8
Created cache
section_end:1649843615:archive_cache section_start:1649843615:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=242175 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=211020 responseStatus=201 Created token=ihHsmuMo
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=242218 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=211020 responseStatus=201 Created token=ihHsmuMo
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=242259 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=211020 responseStatus=201 Created token=ihHsmuMo
section_end:1649843617:upload_artifacts_on_success section_start:1649843617:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1649843618:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1649843397:resolve_secrets Resolving secrets
section_end:1649843397:resolve_secrets section_start:1649843397:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor072682831, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 48472, build ref 9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6, job ID 211019
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
A slurm job will be created with name GL_J211019_PROD_P112_CP4_C3
Job parameters: memory=76G, cpus_per_task=8, duration=1:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 379122
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J211019_PROD_P112_CP4_C3 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --ntasks=2 --jobid=379122 --cpus-per-task=8 --mem=76G
section_end:1649843399:prepare_executor section_start:1649843399:prepare_script Preparing environment
Running on r2i2n20 via bbpv1.epfl.ch...
section_end:1649843403:prepare_script section_start:1649843403:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1649843404:get_sources section_start:1649843404:restore_cache Restoring cache
Checking cache for build:coreneuron:mod2c:nvhpc:acc-8...
Runtime platform  arch=amd64 os=linux pid=231924 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1649843407:restore_cache section_start:1649843407:download_artifacts Downloading artifacts
Downloading artifacts for spack_setup (211016)...
Runtime platform  arch=amd64 os=linux pid=232557 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=211016 responseStatus=200 OK token=EkaigVme
section_end:1649843408:download_artifacts section_start:1649843408:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211019/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211019/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211019/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211019/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211019/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
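The listing above is `spack config blame config` output: every effective setting of the merged `config` section together with the file and line it came from, which makes it easy to check the precedence of the job-local scope (via SPACK_USER_CONFIG_PATH), the site scope under /gpfs/bbp.cscs.ch/ssd/apps/bsd//config, and Spack's shipped defaults. Related commands, shown only as an illustrative sketch:

    spack config blame config                              # merged values with file:line provenance
    spack config get config                                # merged values only
    spack config --scope user add 'config:build_jobs:8'    # override a single value in the user scope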
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access [email protected] (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access [email protected] (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472/J211019_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = [email protected]:" > "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = [email protected]:
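The commands above point XDG_CONFIG_HOME at a job-private directory and drop a git `url.<base>.insteadOf` mapping there, so any `git@bbpgitlab.epfl.ch:` remote (for example in Spack package recipes) is transparently rewritten to token-authenticated HTTPS using the ephemeral CI_JOB_TOKEN, without SSH keys. A sketch of the same setup written with `git config` directly (paths as in this job, otherwise illustrative):

    export XDG_CONFIG_HOME="${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config"
    mkdir -p "${XDG_CONFIG_HOME}/git"
    # Rewrite SSH-style remotes to HTTPS with the job token
    git config --file "${XDG_CONFIG_HOME}/git/config" \
        "url.https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/.insteadOf" \
        "git@bbpgitlab.epfl.ch:"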
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install coreneuron%nvhpc +gpu+openmp+tests~legacy-unit build_type=RelWithDebInfo
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be myaj5gmcll4sywf66372ze5qfubwzr3s
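`spack spec --json` emits the fully concretized DAG; the root package is the first entry of `spec.nodes`, and its `hash` is the installation hash used below to name the stage directory and to address the package as `/<hash>`. For reference, the same extraction with jq instead of the inline Python one-liner (assuming jq is available on the node):

    spack spec --json ${SPACK_FULL_SPEC} | jq -r '.spec.nodes[0].hash'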
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
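`${SPACK_INSTALLED_HASH:0:7}` is plain Bash substring expansion: Spack names the build directory inside the stage `spack-build-<first 7 hash characters>`, so the script can predict it before the install runs. Tiny illustration:

    hash=myaj5gmcll4sywf66372ze5qfubwzr3s
    echo "spack-build-${hash:0:7}"   # -> spack-build-myaj5gm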
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
  Cache directory: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/379122/ccache
  Primary config: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/379122/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Apr 13 11:50:39 2022
  Hits: 0 / 0
    Direct: 0 / 0
    Preprocessed: 0 / 0
  Misses: 0
    Direct: 0
    Preprocessed: 0
Primary storage:
  Hits: 0 / 0
  Misses: 0
  Cache size (GB): 0.29 / 0.51 (57.10 %)
  Files: 9522
$ fi
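The block above keeps the compiler cache on node-local scratch (TMPDIR) and round-trips it through ccache.tar, which is stored in the GitLab runner cache between pipelines; CCACHE_BASEDIR makes paths below the per-pipeline build root hash identically across jobs, and CCACHE_MAXSIZE bounds the archive. The essential round-trip, condensed from the commands above and below:

    export CCACHE_DIR="${TMPDIR}/ccache"                        # node-local, discarded with the allocation
    export CCACHE_BASEDIR="$(realpath -P "${CI_BUILDS_DIR}")"   # relativize paths for cross-job cache hits
    export CCACHE_MAXSIZE=512M
    mkdir -p "${CCACHE_DIR}"
    [ -f "${CI_PROJECT_DIR}/ccache.tar" ] && tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
    # ... build ...
    ccache --cleanup                                            # trim to CCACHE_MAXSIZE before archiving
    tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .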
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: 'myaj5gmcll4sywf66372ze5qfubwzr3s'
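The error above is expected: the script pre-emptively uninstalls anything already registered under the predicted hash (plus its dependents) so the subsequent `spack install` always rebuilds from the current sources, and `|| true` turns the "nothing installed yet" case into a no-op. An equivalent, slightly more explicit sketch (assumes `spack find` exits non-zero when the hash is absent):

    if spack find "/${SPACK_INSTALLED_HASH}" >/dev/null 2>&1; then
        spack uninstall -y --dependents "/${SPACK_INSTALLED_HASH}"
    fi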
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- coreneuron%nvhpc+gpu~legacy-unit+openmp+tests build_type=RelWithDebInfo
Concretized
--------------------------------
- myaj5gm coreneuron@develop%[email protected]~caliper~codegenopt+gpu~ipo~ispc~knl~legacy-unit+mpi~nmodl+openmp~profile+report+shared~sympy~sympyopt+tests~unified build_type=RelWithDebInfo arch=linux-rhel7-skylake
- 6s6wcfe ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] vn2t5vi ^[email protected]%[email protected]+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
- ucwiakr ^[email protected]%[email protected]~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
[^] gi5x2dn ^[email protected]%[email protected]~allow-unsupported-compilers~dev arch=linux-rhel7-skylake
- ajxdymq ^[email protected]%[email protected]+lex~nls arch=linux-rhel7-skylake
[^] pjmdwuu ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] vtyvxbi ^[email protected]%[email protected]~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] tbfoeg7 ^[email protected]%[email protected]+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 6dfyugs ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] h7lotu6 ^[email protected]%[email protected]+optimize+pic+shared arch=linux-rhel7-skylake
[^] snd2rt6 ^[email protected]%[email protected]~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
- cp3ofsp ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] f62u3oh ^[email protected]%[email protected]+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] bdkvweu ^[email protected]%[email protected]~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> bison@3.8.2 : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-6s6wcfeanp2mkdib4c3n3ivkcuosopgm)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/boost-1.78.0-hiorc7 (external boost-1.78.0-vn2t5viljvtb5wyfhgmx57txyeiznxo4)
==> cmake@3.21.4 : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-ucwiakreeghdgbo22kbqqhgnnlwxqtnn)
==> cuda@11.6.1 : has external module in ['cuda/11.6.1']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cuda-11.6.1-ngetva (external cuda-11.6.1-gi5x2dn47i7nv76flbq7aatllxpaqmjp)
==> flex@2.6.3 : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-ajxdymqhnzqgeuhvk5nu5zfymzq35n6i)
==> hpe-mpi@2.25.hmpt : has external module in ['hpe-mpi/2.25.hmpt']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/hpe-mpi-2.25.hmpt-4ukyxt (external hpe-mpi-2.25.hmpt-pjmdwuu36ioiwqyrg6lrj7nrq3waqjj2)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/pkgconf-1.8.0-6dfyug
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/zlib-1.2.11-mswjss (external zlib-1.2.11-h7lotu6jszcvumwucry2y5jnhvxw5x2d)
==> ninja@1.10.2 : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-cp3ofspgkdqaqih5vrhotmdwvkozfswp)
==> python@3.9.7 : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-f62u3ohchswt5q2b63chawohzqrl6wvy)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/spdlog-1.9.2-snd2rt
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/reportinglib-2.5.6-bdkvwe
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/hdf5-1.10.7-tbfoeg
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/libsonata-report-1.1-vtyvxb
==> Installing coreneuron-develop-myaj5gmcll4sywf66372ze5qfubwzr3s
==> No binary for coreneuron-develop-myaj5gmcll4sywf66372ze5qfubwzr3s found: installing from source
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472, but it is owned by 0
==> No patches needed for coreneuron
==> coreneuron: Executing phase: 'cmake'
==> coreneuron: Executing phase: 'build'
==> coreneuron: Executing phase: 'install'
==> coreneuron: Successfully installed coreneuron-develop-myaj5gmcll4sywf66372ze5qfubwzr3s
Fetch: 3.73s. Build: 1m 31.24s. Total: 1m 34.97s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/software/install_nvhpc-22.3-skylake/coreneuron-develop-myaj5g
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
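Note that the install command earlier used `|| install_failed=1` rather than failing immediately: that way the ccache archive, build logs and artifacts are still collected, and the failure is only raised here once everything has been saved. The pattern, reduced to its skeleton (helper names are hypothetical):

    install_failed=0
    run_long_build || install_failed=1     # never aborts the script directly
    save_cache_and_logs                    # always executed
    if [[ ${install_failed} == 1 ]]; then exit 1; fi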
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
  Cache directory: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/379122/ccache
  Primary config: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/379122/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Apr 13 11:53:18 2022
  Hits: 109 / 119 (91.60 %)
    Direct: 91 / 119 (76.47 %)
    Preprocessed: 18 / 28 (64.29 %)
  Misses: 10
    Direct: 28
    Preprocessed: 10
  Uncacheable: 19
Primary storage:
  Hits: 200 / 238 (84.03 %)
  Misses: 38
  Cache size (GB): 0.28 / 0.51 (55.54 %)
  Files: 9542
Uncacheable:
  Called for linking: 16
  No input file: 3
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading py-virtualenv/16.7.6
Autoloading py-scipy/1.7.1
Autoloading py-numpy/1.19.5
Autoloading py-mpi4py/3.1.2
Autoloading hpe-mpi/2.25.hmpt
Autoloading py-h5py/3.4.0
Autoloading hdf5/1.10.7
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
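`--log-format=junit` made `spack install` write install.xml, and the one-liner above counts its <failure> and <error> elements so that the job exits with a meaningful non-zero code whenever any package in the DAG failed to install. A slightly expanded version of the same check (assumes lxml is available, here provided by the python-dev module); note that `[[ ${num_failures} > 0 ]]` compares lexicographically in Bash, while `-gt` is the arithmetic form:

    num_failures=$(python -c "
    from lxml import etree
    root = etree.parse('install.xml').getroot()
    print(sum(1 for _ in root.iter('failure')) + sum(1 for _ in root.iter('error')))
    ")
    if [ "${num_failures}" -gt 0 ]; then exit "${num_failures}"; fi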
section_end:1649843604:step_script section_start:1649843604:archive_cache Saving cache for successful job
Creating cache build:coreneuron:mod2c:nvhpc:acc-8...
Runtime platform  arch=amd64 os=linux pid=241903 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Acoreneuron%3Amod2c%3Anvhpc%3Aacc-8
Created cache
section_end:1649843618:archive_cache section_start:1649843618:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=242373 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=211019 responseStatus=201 Created token=Qw7KHaR5
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=242416 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=211019 responseStatus=201 Created token=Qw7KHaR5
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=242467 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=211019 responseStatus=201 Created token=Qw7KHaR5
section_end:1649843620:upload_artifacts_on_success section_start:1649843620:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1649843621:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1649843665:resolve_secrets Resolving secrets
section_end:1649843665:resolve_secrets section_start:1649843665:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor463277424, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 48472, build ref 9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6, job ID 211024
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
A slurm job will be created with name GL_J211024_PROD_P112_CP9_C16
Job parameters: memory=76G, cpus_per_task=8, duration=1:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 379214
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J211024_PROD_P112_CP9_C16 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --ntasks=2 --jobid=379214 --cpus-per-task=8 --mem=76G
section_end:1649843667:prepare_executor section_start:1649843667:prepare_script Preparing environment
Running on r5i0n34 via bbpv1.epfl.ch...
section_end:1649843669:prepare_script section_start:1649843669:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1649843670:get_sources section_start:1649843670:restore_cache Restoring cache
Checking cache for build:coreneuron:nmodl:intel-8...
Runtime platform  arch=amd64 os=linux pid=27714 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1649843691:restore_cache section_start:1649843691:download_artifacts Downloading artifacts
Downloading artifacts for build:nmodl (211018)...
Runtime platform  arch=amd64 os=linux pid=28511 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=211018 responseStatus=200 OK token=8Sk4MWSK
section_end:1649843692:download_artifacts section_start:1649843692:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211024/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211024/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211024/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211024/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211024/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access [email protected] (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access [email protected] (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472/J211024_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = [email protected]:" > "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = [email protected]:
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install coreneuron%intel +nmodl+tests~legacy-unit build_type=Debug ^hpe-mpi%gcc ^/7hnncpjzlyqwesmjkszimpr4c2qteocr
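Compared with the first build, this spec pins two things: `^hpe-mpi%gcc` requests the hpe-mpi external with the GCC toolchain rather than %intel, and `^/7hnncpjzlyqwesmjkszimpr4c2qteocr` is Spack's reuse-by-hash syntax, pointing at the nmodl installation produced by the upstream build:nmodl job (211018) and passed in through SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB. Illustrative commands for working with hash references:

    spack find -l nmodl                                   # list installed nmodl specs with their hashes
    spack spec -Il "coreneuron%intel +nmodl ^/7hnncpj"    # an unambiguous hash prefix is enough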
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be x2m7t35nswt5wf62dii5jjfu45snneyv
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
  Cache directory: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/379214/ccache
  Primary config: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/379214/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Apr 13 11:56:10 2022
  Hits: 0 / 0
    Direct: 0 / 0
    Preprocessed: 0 / 0
  Misses: 0
    Direct: 0
    Preprocessed: 0
Primary storage:
  Hits: 0 / 0
  Misses: 0
  Cache size (GB): 0.41 / 0.51 (80.45 %)
  Files: 8107
$ fi
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: 'x2m7t35nswt5wf62dii5jjfu45snneyv'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- coreneuron%intel~legacy-unit+nmodl+tests build_type=Debug
- ^hpe-mpi%gcc
[+] ^nmodl@develop%[email protected]~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]~i18n arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]+libyaml arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
Concretized
--------------------------------
- x2m7t35 coreneuron@develop%[email protected]~caliper~codegenopt~gpu~ipo~ispc~knl~legacy-unit+mpi+nmodl~openmp~profile+report+shared~sympy~sympyopt+tests~unified build_type=Debug arch=linux-rhel7-skylake
- dwdch6b ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] dl7pfht ^[email protected]%[email protected]+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] 2qmvlfy ^[email protected]%[email protected]~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
- s4ueg72 ^[email protected]%[email protected]+lex~nls arch=linux-rhel7-skylake
[^] hyunzkn ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] vutuzbn ^[email protected]%[email protected]~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] utj47fz ^[email protected]%[email protected]+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ihxi5rl ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] evtnqzd ^[email protected]%[email protected]+optimize+pic+shared arch=linux-rhel7-skylake
[^] x72vveu ^[email protected]%[email protected]~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
- 6ggc5yr ^[email protected]%[email protected] arch=linux-rhel7-skylake
[+] 7hnncpj ^nmodl@develop%[email protected]~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] onq3mhg ^[email protected]%[email protected]~i18n arch=linux-rhel7-skylake
[^] 4ectow5 ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] 72xzp3v ^[email protected]%[email protected]+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] 22arfs4 ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ascbeii ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] kvw3vhm ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] 7iyiygo ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] c7qvw2q ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] mazoiox ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] y7rfzdj ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] v4z3s5e ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] w4gddqx ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ci5oe5b ^[email protected]%[email protected]+libyaml arch=linux-rhel7-skylake
[^] xj2wlac ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] xtett6q ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] dzb2mfs ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] 5fkun4i ^[email protected]%[email protected]~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> bison@3.8.2 : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-dwdch6bmdeclr2novthsywtrryotawwz)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/boost-1.78.0-hiorc7 (external boost-1.78.0-dl7pfhtsvyaq3273gr2g3l5vr37eeydc)
==> cmake@3.21.4 : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-2qmvlfyylrv3t5ankluyr5cqey2nlfzd)
==> flex@2.6.3 : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-s4ueg72j7l6vkdyvfxj2tweo7v7s3otx)
==> hpe-mpi@2.25.hmpt : has external module in ['hpe-mpi/2.25.hmpt']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/hpe-mpi-2.25.hmpt-4ukyxt (external hpe-mpi-2.25.hmpt-hyunzknktdzri34bx26bupkvapehonql)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/pkgconf-1.8.0-ihxi5r
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/zlib-1.2.11-mswjss (external zlib-1.2.11-evtnqzdtuvprwnbd2nljimpzywrhw4uo)
==> ninja@1.10.2 : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-6ggc5yre7qddwxdjmn7sfptpdoiy4dtp)
==> python@3.9.7 : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-72xzp3vgzh5b424qevkfruhs3ajzqbiy)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libyaml-0.2.5-xj2wla
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/spdlog-1.9.2-x72vve
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/reportinglib-2.5.6-5fkun4
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/hdf5-1.10.7-utj47f
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-markupsafe-2.0.1-4ectow
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-attrs-21.2.0-ascbei
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-py-1.9.0-y7rfzd
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-toml-0.10.2-w4gddq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-mpmath-1.1.0-dzb2mf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pluggy-0.13.0-mazoio
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-iniconfig-1.1.1-kvw3vh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-setuptools-57.4.0-v4z3s5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyparsing-2.4.7-c7qvw2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyyaml-5.3.1-ci5oe5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libsonata-report-1.1-vutuzb
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-jinja2-3.0.1-onq3mh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-sympy-1.8-xtett6
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-packaging-21.0-7iyiyg
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pytest-6.2.4-22arfs
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/software/install_gcc-11.2.0-skylake/nmodl-develop-7hnncp
==> Installing coreneuron-develop-x2m7t35nswt5wf62dii5jjfu45snneyv
==> No binary for coreneuron-develop-x2m7t35nswt5wf62dii5jjfu45snneyv found: installing from source
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472, but it is owned by 0
==> No patches needed for coreneuron
==> coreneuron: Executing phase: 'cmake'
==> coreneuron: Executing phase: 'build'
==> coreneuron: Executing phase: 'install'
==> coreneuron: Successfully installed coreneuron-develop-x2m7t35nswt5wf62dii5jjfu45snneyv
Fetch: 4.32s. Build: 39.11s. Total: 43.43s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/software/install_intel-2021.4.0-skylake/coreneuron-develop-x2m7t3
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
  Cache directory: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/379214/ccache
  Primary config: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/379214/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Apr 13 11:59:09 2022
  Hits: 88 / 95 (92.63 %)
    Direct: 79 / 95 (83.16 %)
    Preprocessed: 9 / 16 (56.25 %)
  Misses: 7
    Direct: 16
    Preprocessed: 7
  Uncacheable: 16
Primary storage:
  Hits: 167 / 190 (87.89 %)
  Misses: 23
  Cache size (GB): 0.42 / 0.51 (82.25 %)
  Files: 8128
Uncacheable:
  Called for linking: 13
  No input file: 3
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading py-virtualenv/16.7.6
Autoloading py-scipy/1.7.1
Autoloading py-numpy/1.19.5
Autoloading py-mpi4py/3.1.2
Autoloading hpe-mpi/2.25.hmpt
Autoloading py-h5py/3.4.0
Autoloading hdf5/1.10.7
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
section_end:1649843960:step_script section_start:1649843960:archive_cache Saving cache for successful job
Creating cache build:coreneuron:nmodl:intel-8...
Runtime platform  arch=amd64 os=linux pid=35991 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Acoreneuron%3Anmodl%3Aintel-8
Created cache
section_end:1649843976:archive_cache section_start:1649843976:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=36403 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=211024 responseStatus=201 Created token=ym6v9xs-
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=36445 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=211024 responseStatus=201 Created token=ym6v9xs-
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=36507 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=211024 responseStatus=201 Created token=ym6v9xs-
section_end:1649843978:upload_artifacts_on_success section_start:1649843978:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1649843978:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1649843665:resolve_secrets Resolving secrets
section_end:1649843665:resolve_secrets section_start:1649843665:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor368966414, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 48472, build ref 9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6, job ID 211022
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
A slurm job will be created with name GL_J211022_PROD_P112_CP8_C7
Job parameters: memory=76G, cpus_per_task=8, duration=1:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 379213
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J211022_PROD_P112_CP8_C7 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --ntasks=2 --jobid=379213 --cpus-per-task=8 --mem=76G
section_end:1649843666:prepare_executor section_start:1649843666:prepare_script Preparing environment
Running on r1i4n0 via bbpv1.epfl.ch...
section_end:1649843671:prepare_script section_start:1649843671:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1649843672:get_sources section_start:1649843672:restore_cache Restoring cache
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:53356]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:53400]: Insane message length
Checking cache for build:coreneuron:nmodl:nvhpc:acc-8...
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:53450]: Insane message length
Runtime platform  arch=amd64 os=linux pid=36764 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:53478]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:53518]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:53566]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:53610]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:53662]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:53700]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:53728]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:55972]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:56042]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:56086]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:56122]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:56154]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:56190]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:56414]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:56448]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:56506]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:56548]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:56584]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:56618]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:56654]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:56700]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:56734]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:56786]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:56822]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:56878]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:56918]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:56946]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:59206]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:59239]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:59284]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:59322]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:59356]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:59388]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:59428]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:59469]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:59502]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:59576]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:59630]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:59650]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:59690]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:59723]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:59760]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:59786]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:59842]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:59884]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:59928]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:59956]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:59992]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:60054]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:60114]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:60144]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:33042]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:33082]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:33120]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:33150]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:33191]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:33252]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:33286]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:33319]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:33357]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:33398]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:33440]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:33480]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:33534]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:33568]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:33618]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:33658]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:33688]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:34878]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:34908]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:34976]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:35012]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:35042]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:35100]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:35142]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:35188]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:35222]: Insane message length
srun: error: eio_message_socket_accept: slurm_receive_msg[127.0.0.1:36424]: Insane message length
Successfully extracted cache
section_end:1649843684:restore_cache section_start:1649843684:download_artifacts Downloading artifacts
Downloading artifacts for build:nmodl (211018)...
Runtime platform  arch=amd64 os=linux pid=37070 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=211018 responseStatus=200 OK token=8Sk4MWSK
section_end:1649843685:download_artifacts section_start:1649843685:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211022/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211022/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211022/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211022/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211022/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
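The heredoc that writes the per-job config.yaml is collapsed in the log above, but the `spack config blame config` output attributes exactly two keys to that file: build_stage and source_cache. A minimal Python sketch of equivalent contents follows; job_dir is a shortened, hypothetical stand-in for this job's J211022 directory on GPFS.

from pathlib import Path
import textwrap

# Shortened stand-in for /gpfs/.../P48472/J211022 so the sketch runs anywhere.
job_dir = Path("P48472/J211022")

# The two keys below are the ones the blame output traces to the per-job file.
config_text = textwrap.dedent(f"""\
    config:
      build_stage:
        - {job_dir}/spack-build
      source_cache: {job_dir}/spack-source-cache
""")

config_dir = job_dir / "spack-config"
config_dir.mkdir(parents=True, exist_ok=True)
(config_dir / "config.yaml").write_text(config_text)
print(config_text)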
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access [email protected] (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access [email protected] (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472/J211022_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = [email protected]:" > "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = [email protected]:
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install coreneuron%nvhpc +nmodl~openmp+gpu+tests~legacy-unit+sympy build_type=RelWithDebInfo ^hpe-mpi%gcc ^/7hnncpjzlyqwesmjkszimpr4c2qteocr
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be 7eavmcfdqg53mknhnfbcd3xnco2pxk7o
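For reference, the inline Python used above to derive SPACK_INSTALLED_HASH is equivalent to this more readable sketch; it expects the output of `spack spec --json ${SPACK_FULL_SPEC}` on stdin, exactly like the one-liner.

import json
import sys

# `spack spec --json` emits a document whose "spec" -> "nodes" list starts with
# the root spec; its "hash" field is what the job records as SPACK_INSTALLED_HASH
# (7eavmcfdqg53mknhnfbcd3xnco2pxk7o for this job).
spec_document = json.load(sys.stdin)
root_node = spec_document["spec"]["nodes"][0]
print(root_node["hash"])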
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
Cache directory: /nvme/bbpcihpcproj12/379213/ccache
Primary config: /nvme/bbpcihpcproj12/379213/ccache/ccache.conf
Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
Stats updated: Wed Apr 13 11:55:25 2022
Hits: 0 / 0
Direct: 0 / 0
Preprocessed: 0 / 0
Misses: 0
Direct: 0
Preprocessed: 0
Primary storage:
Hits: 0 / 0
Misses: 0
Cache size (GB): 0.30 / 0.51 (58.96 %)
Files: 10052
$ fi
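The ccache block above persists the compiler cache between pipeline runs: ccache.tar (if present) is unpacked from ${CI_PROJECT_DIR} into a node-local ${TMPDIR}/ccache before the build and re-archived after it. A minimal sketch of that round trip, with placeholder paths standing in for the CI variables:

import os
import tarfile
from pathlib import Path

# Placeholders standing in for CI_PROJECT_DIR and TMPDIR on the build node.
project_dir = Path(os.environ.get("CI_PROJECT_DIR", "."))
cache_dir = Path(os.environ.get("TMPDIR", "/tmp")) / "ccache"
cache_dir.mkdir(parents=True, exist_ok=True)

archive = project_dir / "ccache.tar"
if archive.exists():
    # Restore the cache saved by a previous job, mirroring `tar -C ... -xf ...`.
    with tarfile.open(archive) as tar:
        tar.extractall(cache_dir)

# ... the spack build would run here, with CCACHE_DIR pointing at cache_dir ...

# Re-archive the (possibly updated) cache, mirroring `tar -C ... -cf ... .`.
with tarfile.open(archive, "w") as tar:
    tar.add(cache_dir, arcname=".")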
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: '7eavmcfdqg53mknhnfbcd3xnco2pxk7o'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- coreneuron%nvhpc+gpu~legacy-unit+nmodl~openmp+sympy+tests build_type=RelWithDebInfo
- ^hpe-mpi%gcc
[+] ^nmodl@develop%gcc@11.2.0~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^py-jinja2@3.0.1%gcc@11.2.0~i18n arch=linux-rhel7-skylake
[^] ^py-markupsafe@2.0.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^python@3.9.7%gcc@11.2.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] ^py-pytest@6.2.4%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-attrs@21.2.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-iniconfig@1.1.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-packaging@21.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pyparsing@2.4.7%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pluggy@0.13.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-py@1.9.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-setuptools@57.4.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-toml@0.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pyyaml@5.3.1%gcc@11.2.0+libyaml arch=linux-rhel7-skylake
[^] ^libyaml@0.2.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-sympy@1.8%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-mpmath@1.1.0%gcc@11.2.0 arch=linux-rhel7-skylake
Concretized
--------------------------------
- 7eavmcf coreneuron@develop%nvhpc@22.3~caliper~codegenopt+gpu~ipo~ispc~knl~legacy-unit+mpi+nmodl~openmp~profile+report+shared+sympy~sympyopt+tests~unified build_type=RelWithDebInfo arch=linux-rhel7-skylake
- 6s6wcfe ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] vn2t5vi ^boost@1.78.0%gcc@11.2.0+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] 2qmvlfy ^cmake@3.21.4%gcc@11.2.0~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
[^] gi5x2dn ^cuda@11.6.1%gcc@11.2.0~allow-unsupported-compilers~dev arch=linux-rhel7-skylake
- ajxdymq ^[email protected]%[email protected]+lex~nls arch=linux-rhel7-skylake
[^] hyunzkn ^hpe-mpi@2.25.hmpt%gcc@11.2.0 arch=linux-rhel7-skylake
[^] vutuzbn ^libsonata-report@1.1%gcc@11.2.0~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] utj47fz ^hdf5@1.10.7%gcc@11.2.0+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ihxi5rl ^pkgconf@1.8.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] evtnqzd ^zlib@1.2.11%gcc@11.2.0+optimize+pic+shared arch=linux-rhel7-skylake
[^] x72vveu ^spdlog@1.9.2%gcc@11.2.0~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
- cp3ofsp ^[email protected]%[email protected] arch=linux-rhel7-skylake
[+] 7hnncpj ^nmodl@develop%gcc@11.2.0~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] onq3mhg ^py-jinja2@3.0.1%gcc@11.2.0~i18n arch=linux-rhel7-skylake
[^] 4ectow5 ^py-markupsafe@2.0.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 72xzp3v ^python@3.9.7%gcc@11.2.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] 22arfs4 ^py-pytest@6.2.4%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ascbeii ^py-attrs@21.2.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] kvw3vhm ^py-iniconfig@1.1.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 7iyiygo ^py-packaging@21.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] c7qvw2q ^py-pyparsing@2.4.7%gcc@11.2.0 arch=linux-rhel7-skylake
[^] mazoiox ^py-pluggy@0.13.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] y7rfzdj ^py-py@1.9.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] v4z3s5e ^py-setuptools@57.4.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] w4gddqx ^py-toml@0.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ci5oe5b ^py-pyyaml@5.3.1%gcc@11.2.0+libyaml arch=linux-rhel7-skylake
[^] xj2wlac ^libyaml@0.2.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] xtett6q ^py-sympy@1.8%gcc@11.2.0 arch=linux-rhel7-skylake
[^] dzb2mfs ^py-mpmath@1.1.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 5fkun4i ^reportinglib@2.5.6%gcc@11.2.0~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> bison@3.8.2 : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-6s6wcfeanp2mkdib4c3n3ivkcuosopgm)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/boost-1.78.0-hiorc7 (external boost-1.78.0-vn2t5viljvtb5wyfhgmx57txyeiznxo4)
==> cmake@3.21.4 : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-2qmvlfyylrv3t5ankluyr5cqey2nlfzd)
==> cuda@11.6.1 : has external module in ['cuda/11.6.1']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cuda-11.6.1-ngetva (external cuda-11.6.1-gi5x2dn47i7nv76flbq7aatllxpaqmjp)
==> flex@2.6.3 : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-ajxdymqhnzqgeuhvk5nu5zfymzq35n6i)
==> hpe-mpi@2.25.hmpt : has external module in ['hpe-mpi/2.25.hmpt']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/hpe-mpi-2.25.hmpt-4ukyxt (external hpe-mpi-2.25.hmpt-hyunzknktdzri34bx26bupkvapehonql)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/pkgconf-1.8.0-ihxi5r
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/zlib-1.2.11-mswjss (external zlib-1.2.11-evtnqzdtuvprwnbd2nljimpzywrhw4uo)
==> ninja@1.10.2 : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-cp3ofspgkdqaqih5vrhotmdwvkozfswp)
==> python@3.9.7 : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-72xzp3vgzh5b424qevkfruhs3ajzqbiy)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libyaml-0.2.5-xj2wla
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/spdlog-1.9.2-x72vve
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/reportinglib-2.5.6-5fkun4
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/hdf5-1.10.7-utj47f
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pluggy-0.13.0-mazoio
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-toml-0.10.2-w4gddq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-py-1.9.0-y7rfzd
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-iniconfig-1.1.1-kvw3vh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-markupsafe-2.0.1-4ectow
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyparsing-2.4.7-c7qvw2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-mpmath-1.1.0-dzb2mf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-setuptools-57.4.0-v4z3s5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-attrs-21.2.0-ascbei
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyyaml-5.3.1-ci5oe5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libsonata-report-1.1-vutuzb
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-jinja2-3.0.1-onq3mh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-packaging-21.0-7iyiyg
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-sympy-1.8-xtett6
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pytest-6.2.4-22arfs
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/software/install_gcc-11.2.0-skylake/nmodl-develop-7hnncp
==> Installing coreneuron-develop-7eavmcfdqg53mknhnfbcd3xnco2pxk7o
==> No binary for coreneuron-develop-7eavmcfdqg53mknhnfbcd3xnco2pxk7o found: installing from source
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472, but it is owned by 0
==> No patches needed for coreneuron
==> coreneuron: Executing phase: 'cmake'
==> coreneuron: Executing phase: 'build'
==> coreneuron: Executing phase: 'install'
==> coreneuron: Successfully installed coreneuron-develop-7eavmcfdqg53mknhnfbcd3xnco2pxk7o
Fetch: 4.03s. Build: 3m 49.87s. Total: 3m 53.89s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/software/install_nvhpc-22.3-skylake/coreneuron-develop-7eavmc
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
Cache directory: /nvme/bbpcihpcproj12/379213/ccache
Primary config: /nvme/bbpcihpcproj12/379213/ccache/ccache.conf
Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
Stats updated: Wed Apr 13 12:00:12 2022
Hits: 88 / 97 (90.72 %)
Direct: 79 / 97 (81.44 %)
Preprocessed: 9 / 18 (50.00 %)
Misses: 9
Direct: 18
Preprocessed: 9
Uncacheable: 16
Primary storage:
Hits: 167 / 194 (86.08 %)
Misses: 27
Cache size (GB): 0.30 / 0.51 (59.18 %)
Files: 10078
Uncacheable:
Called for linking: 13
No input file: 3
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading py-virtualenv/16.7.6
Autoloading py-scipy/1.7.1
Autoloading py-numpy/1.19.5
Autoloading py-mpi4py/3.1.2
Autoloading hpe-mpi/2.25.hmpt
Autoloading py-h5py/3.4.0
Autoloading hdf5/1.10.7
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
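The failure count above is taken from the JUnit report that `spack install --log-format=junit` wrote to install.xml; the inline lxml one-liner simply counts <failure> and <error> elements. An equivalent sketch using the standard-library ElementTree (lxml exposes the same parse/iter calls) is below. Note that `[[ ${num_failures} > 0 ]]` is a string comparison in bash; it gives the expected answer for the non-negative integers printed here, but `-gt` is the usual numeric test.

import sys
import xml.etree.ElementTree as ET

# Count <failure> and <error> elements in the JUnit report, as the inline
# command above does; the report path mirrors ${CI_PROJECT_DIR}/install.xml.
root = ET.parse("install.xml").getroot()
num_failures = sum(1 for _ in root.iter("failure")) + sum(1 for _ in root.iter("error"))
print(num_failures)
sys.exit(num_failures)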
section_end:1649844013:step_script section_start:1649844013:archive_cache Saving cache for successful job
Creating cache build:coreneuron:nmodl:nvhpc:acc-8...
Runtime platform  arch=amd64 os=linux pid=50682 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Acoreneuron%3Anmodl%3Anvhpc%3Aacc-8
Created cache
section_end:1649844028:archive_cache section_start:1649844028:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=51126 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=211022 responseStatus=201 Created token=T5xFNgyz
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=51154 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=211022 responseStatus=201 Created token=T5xFNgyz
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=51195 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=211022 responseStatus=201 Created token=T5xFNgyz
section_end:1649844029:upload_artifacts_on_success section_start:1649844029:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1649844030:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1649843663:resolve_secrets Resolving secrets
section_end:1649843663:resolve_secrets section_start:1649843663:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor764470396, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 48472, build ref 9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6, job ID 211021
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
A slurm job will be created with name GL_J211021_PROD_P112_CP3_C2
Job parameters: memory=76G, cpus_per_task=8, duration=1:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 379212
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J211021_PROD_P112_CP3_C2 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --ntasks=2 --jobid=379212 --cpus-per-task=8 --mem=76G
section_end:1649843666:prepare_executor section_start:1649843666:prepare_script Preparing environment
Running on r1i4n0 via bbpv1.epfl.ch...
section_end:1649843671:prepare_script section_start:1649843671:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1649843672:get_sources section_start:1649843672:restore_cache Restoring cache
Checking cache for build:coreneuron:nmodl:nvhpc:omp-8...
Runtime platform  arch=amd64 os=linux pid=36808 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1649843684:restore_cache section_start:1649843684:download_artifacts Downloading artifacts
Downloading artifacts for build:nmodl (211018)...
Runtime platform  arch=amd64 os=linux pid=37007 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=211018 responseStatus=200 OK token=8Sk4MWSK
section_end:1649843685:download_artifacts section_start:1649843685:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211021/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211021/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211021/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211021/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211021/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access [email protected] (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access [email protected] (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472/J211021_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = [email protected]:" > "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = [email protected]:
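The two steps above point XDG_CONFIG_HOME at a job-local directory and install a url.insteadOf rewrite so that git@bbpgitlab.epfl.ch: clones transparently go over HTTPS with the job token. A small sketch of the same rewrite using `git config --file` instead of echoing the file by hand; the token value is a placeholder (the real one is masked in the log):

import os
import subprocess
from pathlib import Path

token = os.environ.get("CI_JOB_TOKEN", "dummy-token")              # placeholder
config_dir = Path(os.environ.get("XDG_CONFIG_HOME", ".")) / "git"
config_dir.mkdir(parents=True, exist_ok=True)

# Equivalent to the [url "..."] insteadOf block written above: rewrite SSH-style
# remotes to token-authenticated HTTPS for this job only.
subprocess.run(
    ["git", "config", "--file", str(config_dir / "config"),
     f"url.https://gitlab-ci-token:{token}@bbpgitlab.epfl.ch/.insteadOf",
     "git@bbpgitlab.epfl.ch:"],
    check=True,
)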
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install coreneuron%nvhpc +nmodl+openmp+gpu+tests~legacy-unit~sympy build_type=RelWithDebInfo ^hpe-mpi%gcc ^/7hnncpjzlyqwesmjkszimpr4c2qteocr
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be dqfui2d7qu3mc6w6xhrktts7kg2pjeio
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
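The three directories above are derived from the build root, the package name and the concretized hash; `${SPACK_INSTALLED_HASH:0:7}` keeps only the first seven characters for the build subdirectory. In Python terms, roughly:

from pathlib import Path

spack_build = Path("spack-build")                    # ${SPACK_BUILD}
package = "coreneuron"                               # ${SPACK_PACKAGE}
installed_hash = "dqfui2d7qu3mc6w6xhrktts7kg2pjeio"  # this job's hash

stage_dir = spack_build / f"spack-stage-{package}-develop-{installed_hash}"
build_dir = stage_dir / f"spack-build-{installed_hash[:7]}"  # hash truncated to 7 chars
source_dir = stage_dir / "spack-src"
print(stage_dir, build_dir, source_dir, sep="\n")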
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
Cache directory: /nvme/bbpcihpcproj12/379212/ccache
Primary config: /nvme/bbpcihpcproj12/379212/ccache/ccache.conf
Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
Stats updated: Wed Apr 13 11:55:26 2022
Hits: 0 / 0
Direct: 0 / 0
Preprocessed: 0 / 0
Misses: 0
Direct: 0
Preprocessed: 0
Primary storage:
Hits: 0 / 0
Misses: 0
Cache size (GB): 0.32 / 0.51 (62.56 %)
Files: 10850
$ fi
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: 'dqfui2d7qu3mc6w6xhrktts7kg2pjeio'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- coreneuron%nvhpc+gpu~legacy-unit+nmodl+openmp~sympy+tests build_type=RelWithDebInfo
- ^hpe-mpi%gcc
[+] ^nmodl@develop%gcc@11.2.0~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^py-jinja2@3.0.1%gcc@11.2.0~i18n arch=linux-rhel7-skylake
[^] ^py-markupsafe@2.0.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^python@3.9.7%gcc@11.2.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] ^py-pytest@6.2.4%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-attrs@21.2.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-iniconfig@1.1.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-packaging@21.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pyparsing@2.4.7%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pluggy@0.13.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-py@1.9.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-setuptools@57.4.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-toml@0.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pyyaml@5.3.1%gcc@11.2.0+libyaml arch=linux-rhel7-skylake
[^] ^libyaml@0.2.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-sympy@1.8%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-mpmath@1.1.0%gcc@11.2.0 arch=linux-rhel7-skylake
Concretized
--------------------------------
- dqfui2d coreneuron@develop%nvhpc@22.3~caliper~codegenopt+gpu~ipo~ispc~knl~legacy-unit+mpi+nmodl+openmp~profile+report+shared~sympy~sympyopt+tests~unified build_type=RelWithDebInfo arch=linux-rhel7-skylake
- 6s6wcfe ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] vn2t5vi ^boost@1.78.0%gcc@11.2.0+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] 2qmvlfy ^cmake@3.21.4%gcc@11.2.0~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
[^] gi5x2dn ^cuda@11.6.1%gcc@11.2.0~allow-unsupported-compilers~dev arch=linux-rhel7-skylake
- ajxdymq ^[email protected]%[email protected]+lex~nls arch=linux-rhel7-skylake
[^] hyunzkn ^hpe-mpi@2.25.hmpt%gcc@11.2.0 arch=linux-rhel7-skylake
[^] vutuzbn ^libsonata-report@1.1%gcc@11.2.0~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] utj47fz ^hdf5@1.10.7%gcc@11.2.0+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ihxi5rl ^pkgconf@1.8.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] evtnqzd ^zlib@1.2.11%gcc@11.2.0+optimize+pic+shared arch=linux-rhel7-skylake
[^] x72vveu ^spdlog@1.9.2%gcc@11.2.0~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
- cp3ofsp ^[email protected]%[email protected] arch=linux-rhel7-skylake
[+] 7hnncpj ^nmodl@develop%gcc@11.2.0~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] onq3mhg ^py-jinja2@3.0.1%gcc@11.2.0~i18n arch=linux-rhel7-skylake
[^] 4ectow5 ^py-markupsafe@2.0.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 72xzp3v ^python@3.9.7%gcc@11.2.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] 22arfs4 ^py-pytest@6.2.4%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ascbeii ^py-attrs@21.2.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] kvw3vhm ^py-iniconfig@1.1.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 7iyiygo ^py-packaging@21.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] c7qvw2q ^py-pyparsing@2.4.7%gcc@11.2.0 arch=linux-rhel7-skylake
[^] mazoiox ^py-pluggy@0.13.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] y7rfzdj ^py-py@1.9.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] v4z3s5e ^py-setuptools@57.4.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] w4gddqx ^py-toml@0.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ci5oe5b ^py-pyyaml@5.3.1%gcc@11.2.0+libyaml arch=linux-rhel7-skylake
[^] xj2wlac ^libyaml@0.2.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] xtett6q ^py-sympy@1.8%gcc@11.2.0 arch=linux-rhel7-skylake
[^] dzb2mfs ^py-mpmath@1.1.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 5fkun4i ^reportinglib@2.5.6%gcc@11.2.0~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> bison@3.8.2 : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-6s6wcfeanp2mkdib4c3n3ivkcuosopgm)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/boost-1.78.0-hiorc7 (external boost-1.78.0-vn2t5viljvtb5wyfhgmx57txyeiznxo4)
==> cmake@3.21.4 : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-2qmvlfyylrv3t5ankluyr5cqey2nlfzd)
==> cuda@11.6.1 : has external module in ['cuda/11.6.1']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cuda-11.6.1-ngetva (external cuda-11.6.1-gi5x2dn47i7nv76flbq7aatllxpaqmjp)
==> flex@2.6.3 : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-ajxdymqhnzqgeuhvk5nu5zfymzq35n6i)
==> hpe-mpi@2.25.hmpt : has external module in ['hpe-mpi/2.25.hmpt']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/hpe-mpi-2.25.hmpt-4ukyxt (external hpe-mpi-2.25.hmpt-hyunzknktdzri34bx26bupkvapehonql)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/pkgconf-1.8.0-ihxi5r
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/zlib-1.2.11-mswjss (external zlib-1.2.11-evtnqzdtuvprwnbd2nljimpzywrhw4uo)
==> ninja@1.10.2 : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-cp3ofspgkdqaqih5vrhotmdwvkozfswp)
==> python@3.9.7 : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-72xzp3vgzh5b424qevkfruhs3ajzqbiy)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libyaml-0.2.5-xj2wla
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/spdlog-1.9.2-x72vve
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/reportinglib-2.5.6-5fkun4
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/hdf5-1.10.7-utj47f
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-mpmath-1.1.0-dzb2mf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-iniconfig-1.1.1-kvw3vh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-attrs-21.2.0-ascbei
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-toml-0.10.2-w4gddq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyparsing-2.4.7-c7qvw2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-setuptools-57.4.0-v4z3s5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pluggy-0.13.0-mazoio
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-markupsafe-2.0.1-4ectow
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-py-1.9.0-y7rfzd
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyyaml-5.3.1-ci5oe5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libsonata-report-1.1-vutuzb
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-sympy-1.8-xtett6
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-packaging-21.0-7iyiyg
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-jinja2-3.0.1-onq3mh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pytest-6.2.4-22arfs
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/software/install_gcc-11.2.0-skylake/nmodl-develop-7hnncp
==> Installing coreneuron-develop-dqfui2d7qu3mc6w6xhrktts7kg2pjeio
==> No binary for coreneuron-develop-dqfui2d7qu3mc6w6xhrktts7kg2pjeio found: installing from source
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472, but it is owned by 0
==> No patches needed for coreneuron
==> coreneuron: Executing phase: 'cmake'
==> coreneuron: Executing phase: 'build'
==> coreneuron: Executing phase: 'install'
==> coreneuron: Successfully installed coreneuron-develop-dqfui2d7qu3mc6w6xhrktts7kg2pjeio
Fetch: 4.56s. Build: 3m 48.79s. Total: 3m 53.35s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/software/install_nvhpc-22.3-skylake/coreneuron-develop-dqfui2
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
Cache directory: /nvme/bbpcihpcproj12/379212/ccache
Primary config: /nvme/bbpcihpcproj12/379212/ccache/ccache.conf
Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
Stats updated: Wed Apr 13 12:00:11 2022
Hits: 88 / 98 (89.80 %)
Direct: 79 / 98 (80.61 %)
Preprocessed: 9 / 19 (47.37 %)
Misses: 10
Direct: 19
Preprocessed: 10
Uncacheable: 17
Primary storage:
Hits: 167 / 196 (85.20 %)
Misses: 29
Cache size (GB): 0.32 / 0.51 (62.79 %)
Files: 10878
Uncacheable:
Called for linking: 14
No input file: 3
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading py-virtualenv/16.7.6
Autoloading py-scipy/1.7.1
Autoloading py-numpy/1.19.5
Autoloading py-mpi4py/3.1.2
Autoloading hpe-mpi/2.25.hmpt
Autoloading py-h5py/3.4.0
Autoloading hdf5/1.10.7
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
section_end:1649844012:step_script section_start:1649844012:archive_cache Saving cache for successful job
Creating cache build:coreneuron:nmodl:nvhpc:omp-8...
Runtime platform  arch=amd64 os=linux pid=50589 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Acoreneuron%3Anmodl%3Anvhpc%3Aomp-8
Created cache
section_end:1649844030:archive_cache section_start:1649844030:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=51315 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=211021 responseStatus=201 Created token=Koj5cr-4
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=51365 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=211021 responseStatus=201 Created token=Koj5cr-4
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=51408 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=211021 responseStatus=201 Created token=Koj5cr-4
section_end:1649844031:upload_artifacts_on_success section_start:1649844031:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1649844032:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1649843540:resolve_secrets Resolving secrets
section_end:1649843540:resolve_secrets section_start:1649843540:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor166864381, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 48472, build ref 9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6, job ID 211028
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
A slurm job will be created with name GL_J211028_PROD_P112_CP5_C13
Job parameters: memory=76G, cpus_per_task=8, duration=2:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 379163
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J211028_PROD_P112_CP5_C13 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --time=2:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --ntasks=2 --jobid=379163 --cpus-per-task=8 --mem=76G
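As in the earlier jobs, the custom executor reserves the node(s) with a single sbatch job that just sleeps, then attaches every build step to it via srun --jobid. A hedged sketch of how such a command pair could be composed; it mirrors the flags echoed above and is not the actual BB5 executor code.

import shlex

# Values copied from the lines above; a real executor would parse the job id
# from the "Submitted batch job ..." output of sbatch.
job_name = "GL_J211028_PROD_P112_CP5_C13"
build_dir = "/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472"
job_id = "379163"

sbatch_cmd = [
    "sbatch", "-p", "prod", "-A", "proj9998", "--ntasks=2",
    "--cpus-per-task=8", "--mem=76G", f"--job-name={job_name}",
    "-C", "cpu", "--no-requeue", "-D", build_dir, "--time=2:00:00",
    "--wrap=sleep infinity",
]
srun_cmd = [
    "srun", "--mpi=none", f"--chdir={build_dir}", "--ntasks=2",
    f"--jobid={job_id}", "--cpus-per-task=8", "--mem=76G",
]
print(shlex.join(sbatch_cmd))
print(shlex.join(srun_cmd))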
section_end:1649843542:prepare_executor section_start:1649843542:prepare_script Preparing environment
Running on r1i7n21 via bbpv1.epfl.ch...
section_end:1649843544:prepare_script section_start:1649843544:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1649843545:get_sources section_start:1649843545:restore_cache Restoring cache
Checking cache for build:neuron:mod2c:intel-8...
Runtime platform  arch=amd64 os=linux pid=7906 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1649843551:restore_cache section_start:1649843551:download_artifacts Downloading artifacts
Downloading artifacts for build:coreneuron:mod2c:intel (211023)...
Runtime platform  arch=amd64 os=linux pid=8568 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=211023 responseStatus=200 OK token=SAThh_5K
section_end:1649843552:download_artifacts section_start:1649843552:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211028/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211028/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211028/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211028/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211028/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access [email protected] (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access [email protected] (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472/J211028_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = [email protected]:" > "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
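
The commands above set up a job-local git configuration: XDG_CONFIG_HOME points git at a per-job config file whose url/insteadOf rule rewrites SSH remotes on bbpgitlab.epfl.ch to token-authenticated HTTPS, so Spack can clone private repositories with CI_JOB_TOKEN and no SSH key on the build node. A minimal Python sketch of the same idea follows; the fallback directory name and token value are placeholders, not values from this job.

    import os
    from pathlib import Path

    # Job-local git configuration: XDG_CONFIG_HOME points git at this file
    # instead of ~/.gitconfig, so the rewrite only affects this CI job.
    xdg = Path(os.environ.get("XDG_CONFIG_HOME", "J_local_config"))  # placeholder fallback
    (xdg / "git").mkdir(parents=True, exist_ok=True)

    token = os.environ.get("CI_JOB_TOKEN", "dummy-token")  # placeholder outside CI
    config_text = (
        f'[url "https://gitlab-ci-token:{token}@bbpgitlab.epfl.ch/"]\n'
        "\tinsteadOf = git@bbpgitlab.epfl.ch:\n"
    )
    (xdg / "git" / "config").write_text(config_text)

With that rule in place, any remote of the form git@bbpgitlab.epfl.ch:group/project.git is fetched over HTTPS using the job token.
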
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install neuron%intel +coreneuron+debug+tests~legacy-unit~rx3d model_tests=channel-benchmark,olfactory,tqperf-heavy ^/hsyhen5wbojr4dsx5rf32wndj3dpm5nb
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be 47kwkdqbfahzhd6mgo3d7lvidjenlcwo
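
The hash printed above comes straight out of the JSON emitted by spack spec --json: the first entry of spec.nodes is the root package of the concretized DAG, and its hash is what the install will be addressed by. A small, self-contained Python sketch of that extraction, using a hypothetical, heavily truncated spec as input:

    import json

    # Hypothetical, heavily truncated stand-in for `spack spec --json` output;
    # a real spec has one node per package in the DAG and many more fields.
    json_spec = '{"spec": {"nodes": [{"name": "neuron", "hash": "47kwkdqbfahzhd6mgo3d7lvidjenlcwo"}]}}'

    nodes = json.loads(json_spec)["spec"]["nodes"]
    installed_hash = nodes[0]["hash"]   # the root package is the first node
    print(installed_hash)               # full concretized hash
    print(installed_hash[:7])           # short prefix used in the spack-build-<hash> directory name

The same hash then drives the uninstall of any stale build, the spack-stage/spack-build directory names (via ${SPACK_INSTALLED_HASH:0:7}), and the ^/<hash> constraint that is handed to the next job.
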
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
  Cache directory: /nvme/bbpcihpcproj12/379163/ccache
  Primary config: /nvme/bbpcihpcproj12/379163/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Apr 13 11:52:56 2022
  Hits: 0 / 0
    Direct: 0 / 0
    Preprocessed: 0 / 0
  Misses: 0
    Direct: 0
    Preprocessed: 0
Primary storage:
  Hits: 0 / 0
  Misses: 0
  Cache size (GB): 0.46 / 0.51 (90.56 %)
  Files: 28368
$ fi
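
The block above warms up ccache from the ccache.tar artifact of a previous run: the tarball, if present in CI_PROJECT_DIR, is unpacked into a fresh cache directory under the node-local TMPDIR before the build starts. A rough Python equivalent of the restore step, with illustrative paths:

    import os
    import tarfile

    # Restore a previously saved compiler cache, if one was handed over as an artifact.
    ccache_dir = os.path.join(os.environ.get("TMPDIR", "/tmp"), "ccache")
    os.makedirs(ccache_dir, exist_ok=True)

    archive = "ccache.tar"  # ${CI_PROJECT_DIR}/ccache.tar in the job script
    if os.path.isfile(archive):
        with tarfile.open(archive) as tar:
            tar.extractall(path=ccache_dir)  # mirrors: tar -C "${CCACHE_DIR}" -xf ccache.tar

Keeping the cache under TMPDIR (node-local NVMe here, as the cache directory in the statistics above shows) keeps ccache's many small object files off the shared GPFS filesystem during the build; after a successful install the refreshed cache is tarred back into CI_PROJECT_DIR for the next pipeline.
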
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: '47kwkdqbfahzhd6mgo3d7lvidjenlcwo'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- neuron%intel+coreneuron+debug~legacy-unit~rx3d+tests model_tests=channel-benchmark,olfactory,tqperf-heavy
[+] ^coreneuron@develop%[email protected]~caliper~codegenopt~gpu~ipo~ispc~knl~legacy-unit+mpi~nmodl~openmp~profile+report+shared~sympy~sympyopt+tests~unified build_type=Debug arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]+optimize+pic+shared arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
Concretized
--------------------------------
- 47kwkdq neuron@develop%[email protected]+binary~caliper+cmake~codechecks+coreneuron~cross-compile+debug~interviews~ipo+legacy-fr~legacy-unit+memacs+mod-compatibility+mpi+multisend~profile+pysetup+python~rx3d+shared+tests build_type=RelWithDebInfo model_tests=channel-benchmark,olfactory,tqperf-heavy patches=708cb04826b394a858069d93e8c08e1e81e914c23e1ef3da0486e8233834ff6c arch=linux-rhel7-skylake
- dwdch6b ^[email protected]%[email protected] arch=linux-rhel7-skylake
- z3q5f3x ^[email protected]%[email protected]~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
[+] hsyhen5 ^coreneuron@develop%[email protected]~caliper~codegenopt~gpu~ipo~ispc~knl~legacy-unit+mpi~nmodl~openmp~profile+report+shared~sympy~sympyopt+tests~unified build_type=Debug arch=linux-rhel7-skylake
[^] dl7pfht ^[email protected]%[email protected]+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] 3narjkw ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] nfrzrlo ^[email protected]%[email protected]~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ofveh3j ^[email protected]%[email protected]+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] xys6npz ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] e7w5nez ^[email protected]%[email protected]+optimize+pic+shared arch=linux-rhel7-skylake
[^] wk3uenv ^[email protected]%[email protected]~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] f27a7nn ^[email protected]%[email protected]+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] gdhqypa ^[email protected]%[email protected]~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
- s4ueg72 ^[email protected]%[email protected]+lex~nls arch=linux-rhel7-skylake
[^] bjxwlfq ^[email protected]%[email protected]+bzip2+curses+git~libunistring+libxml2+tar+xz arch=linux-rhel7-skylake
[^] dvyfbbf ^[email protected]%[email protected]~debug~pic+shared arch=linux-rhel7-skylake
[^] ka2cfu4 ^[email protected]%[email protected] libs=shared,static arch=linux-rhel7-skylake
[^] w3ob7m2 ^[email protected]%[email protected]~python arch=linux-rhel7-skylake
[^] ahg6hq4 ^[email protected]%[email protected]~pic libs=shared,static arch=linux-rhel7-skylake
[^] q3rwuez ^[email protected]%[email protected]~symlinks+termlib abi=none arch=linux-rhel7-skylake
[^] 2fli66n ^[email protected]%[email protected] patches=08921fcbd732050c74ddf1de7d8ad95ffdbc09f8b4342456fa2f6a0dd02a957c,125cd6142fac2cc339e9aebfe79e40f90766022b8e8401532b1729e84fc148c2,5c314db58d005043bf407abaf25eb9823b9032a22fd12a0b142d4bf548130fa4,d428578be7fb99b831eb61e53b8d88a859afe08b479a21238180899707d79ce4 arch=linux-rhel7-skylake
- 6ggc5yr ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] qj3vvxk ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] 5flpilw ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ipsbwzp ^[email protected]%[email protected]+blas+lapack patches=8a9d5d1b3f145c043b8b04869e7d46c6ff95c3f486d84f69693017c7e6190c7d arch=linux-rhel7-skylake
[^] rl5ojfn ^[email protected]%[email protected]~ilp64+shared threads=none arch=linux-rhel7-skylake
[^] taxuisw ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] tzmlo5v ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] d5szgli ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] huky5p4 ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] wli7caj ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] 3a3ralt ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] rh6zvev ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] kpwupa3 ^[email protected]%[email protected]+toml arch=linux-rhel7-skylake
[^] apu667d ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] jesa65u ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] skrpmyr ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ktthpzx ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] tzl5sq5 ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] zkd7o6d ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] 5abs2t7 ^[email protected]%[email protected] arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> [email protected] : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-dwdch6bmdeclr2novthsywtrryotawwz)
==> [email protected] : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-z3q5f3xwpuibd3qbgdscqmu3efarbu42)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/boost-1.78.0-hiorc7 (external boost-1.78.0-dl7pfhtsvyaq3273gr2g3l5vr37eeydc)
==> [email protected] : has external module in ['hpe-mpi/2.25.hmpt']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/hpe-mpi-2.25.hmpt-4ukyxt (external hpe-mpi-2.25.hmpt-3narjkwjh6i2jr3zl3g5wdjlqi52hkwh)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/pkgconf-1.8.0-xys6np
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/zlib-1.2.11-mswjss (external zlib-1.2.11-e7w5nezpwf572epfhbosqfzboztysout)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/spdlog-1.9.2-wk3uen
==> [email protected] : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-f27a7nnhppeztxijpispnaedcfo2vjxi)
==> [email protected] : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-s4ueg72j7l6vkdyvfxj2tweo7v7s3otx)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bzip2-1.0.8-eu2d6e (external bzip2-1.0.8-dvyfbbfdan33i7wyydyeskb3legztopm)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/libiconv-1.16-ka2cfu
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/xz-5.2.5-ahg6hq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ncurses-6.2-nsve5x (external ncurses-6.2-q3rwuezhc2j56kexunypjpejzketozmc)
[+] /usr (external tar-1.28-2fli66nkkv35g34exwd2xifp5w3gtgo6)
==> [email protected] : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-6ggc5yre7qddwxdjmn7sfptpdoiy4dtp)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/intel-mkl-2020.4.304-rzr3hj (external intel-mkl-2020.4.304-rl5ojfngtc7sffwyz7v6ek6eceq53wed)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/readline-8.1-we2frg (external readline-8.1-5abs2t7f6dtgbejv6kjzd4gjwhdlr7qr)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/reportinglib-2.5.6-gdhqyp
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/hdf5-1.10.7-ofveh3
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-setuptools-57.4.0-5flpil
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/libxml2-2.9.12-w3ob7m
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/libsonata-report-1.1-nfrzrl
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-cython-0.29.24-taxuis
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-toml-0.10.2-ktthpz
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-pyparsing-2.4.7-3a3ral
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-coverage-5.5-zkd7o6
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-pip-21.1.2-jesa65
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-attrs-21.2.0-d5szgl
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-iniconfig-1.1.1-huky5p
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-mpi4py-3.1.2-qj3vvx
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/gettext-0.21-bjxwlf
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/software/install_intel-2021.4.0-skylake/coreneuron-develop-hsyhen
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-numpy-1.19.5-ipsbwz
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-packaging-21.0-wli7ca
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-tomli-1.2.1-apu667
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-setuptools-scm-6.3.2-kpwupa
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-py-1.9.0-skrpmy
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-pluggy-0.13.0-rh6zve
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-pytest-6.2.4-tzmlo5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-pytest-cov-2.8.1-tzl5sq
==> Installing neuron-develop-47kwkdqbfahzhd6mgo3d7lvidjenlcwo
==> No binary for neuron-develop-47kwkdqbfahzhd6mgo3d7lvidjenlcwo found: installing from source
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472, but it is owned by 0
==> Applied patch /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/bluebrain/repo-patches/packages/neuron/revert_Import3d_numerical_format.master.patch
==> neuron: Executing phase: 'cmake'
==> neuron: Executing phase: 'build'
==> neuron: Executing phase: 'install'
==> neuron: Successfully installed neuron-develop-47kwkdqbfahzhd6mgo3d7lvidjenlcwo
Fetch: 15.07s. Build: 8m 0.06s. Total: 8m 15.12s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/software/install_intel-2021.4.0-skylake/neuron-develop-47kwkd
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
  Cache directory: /nvme/bbpcihpcproj12/379163/ccache
  Primary config: /nvme/bbpcihpcproj12/379163/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Apr 13 12:01:58 2022
  Hits: 718 / 816 (87.99 %)
    Direct: 339 / 825 (41.09 %)
    Preprocessed: 379 / 480 (78.96 %)
  Misses: 98
    Direct: 486
    Preprocessed: 101
  Uncacheable: 130
Primary storage:
  Hits: 1361 / 1644 (82.79 %)
  Misses: 283
  Cache size (GB): 0.46 / 0.51 (90.80 %)
  Files: 28560
Uncacheable:
  Autoconf compile/link: 7
  Called for linking: 107
  Called for preprocessing: 1
  Compilation failed: 3
  No input file: 6
  Preprocessing failed: 6
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading py-virtualenv/16.7.6
Autoloading py-scipy/1.7.1
Autoloading py-numpy/1.19.5
Autoloading py-mpi4py/3.1.2
Autoloading hpe-mpi/2.25.hmpt
Autoloading py-h5py/3.4.0
Autoloading hdf5/1.10.7
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
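
The exit logic above counts <failure> and <error> elements in the junit report that spack install --log-format=junit wrote to install.xml, and uses that count as the job's exit status if it is non-zero. A self-contained Python sketch of the same check, with an inline junit document standing in for the real report:

    from lxml import etree

    # Inline stand-in for the install.xml junit report written by spack install.
    junit = b"""<testsuite tests="2">
      <testcase name="neuron"/>
      <testcase name="broken-package"><failure message="build failed"/></testcase>
    </testsuite>"""

    root = etree.fromstring(junit)
    num_failures = sum(1 for _ in root.iter("failure")) + sum(1 for _ in root.iter("error"))
    print(num_failures)  # 1 in this toy example; 0 in the log above, so the job carries on

Note that [[ ${num_failures} > 0 ]] is a string comparison inside [[ ]]; for a non-negative integer compared against 0 it gives the same answer as the arithmetic -gt, so the shorthand is harmless here.
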
section_end:1649844128:step_script section_start:1649844128:archive_cache Saving cache for successful job
Creating cache build:neuron:mod2c:intel-8...
Runtime platform  arch=amd64 os=linux pid=31814 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Aneuron%3Amod2c%3Aintel-8
Created cache
section_end:1649844155:archive_cache section_start:1649844155:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=32243 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=211028 responseStatus=201 Created token=Y6xPs83u
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=32286 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=211028 responseStatus=201 Created token=Y6xPs83u
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=32327 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=211028 responseStatus=201 Created token=Y6xPs83u
section_end:1649844157:upload_artifacts_on_success section_start:1649844157:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1649844158:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1649843622:resolve_secrets Resolving secrets
section_end:1649843622:resolve_secrets section_start:1649843622:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor791176765, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 48472, build ref 9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6, job ID 211025
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
A slurm job will be created with name GL_J211025_PROD_P112_CP6_C5
Job parameters: memory=76G, cpus_per_task=8, duration=2:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 379200
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J211025_PROD_P112_CP6_C5 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --time=2:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --ntasks=2 --jobid=379200 --cpus-per-task=8 --mem=76G
section_end:1649843623:prepare_executor section_start:1649843623:prepare_script Preparing environment
Running on r1i7n21 via bbpv1.epfl.ch...
section_end:1649843625:prepare_script section_start:1649843625:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1649843626:get_sources section_start:1649843626:restore_cache Restoring cache
Checking cache for build:neuron:mod2c:nvhpc:acc-8...
Runtime platform  arch=amd64 os=linux pid=11363 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1649843632:restore_cache section_start:1649843632:download_artifacts Downloading artifacts
Downloading artifacts for build:coreneuron:mod2c:nvhpc:acc (211019)...
Runtime platform  arch=amd64 os=linux pid=11520 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=211019 responseStatus=200 OK token=Qw7KHaR5
section_end:1649843633:download_artifacts section_start:1649843633:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211025/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211025/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211025/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211025/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211025/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access [email protected] (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access [email protected] (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472/J211025_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = [email protected]:" > "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install neuron%nvhpc +coreneuron+debug+tests~legacy-unit~rx3d model_tests=channel-benchmark,olfactory,tqperf-heavy ^/myaj5gmcll4sywf66372ze5qfubwzr3s
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be sqrws2vsdssc3xwshyyrvkm5oyeurofv
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
  Cache directory: /nvme/bbpcihpcproj12/379200/ccache
  Primary config: /nvme/bbpcihpcproj12/379200/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Apr 13 11:54:20 2022
  Hits: 0 / 0
    Direct: 0 / 0
    Preprocessed: 0 / 0
  Misses: 0
    Direct: 0
    Preprocessed: 0
Primary storage:
  Hits: 0 / 0
  Misses: 0
  Cache size (GB): 0.46 / 0.51 (90.81 %)
  Files: 30335
$ fi
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: 'sqrws2vsdssc3xwshyyrvkm5oyeurofv'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- neuron%nvhpc+coreneuron+debug~legacy-unit~rx3d+tests model_tests=channel-benchmark,olfactory,tqperf-heavy
[+] ^coreneuron@develop%[email protected]~caliper~codegenopt+gpu~ipo~ispc~knl~legacy-unit+mpi~nmodl+openmp~profile+report+shared~sympy~sympyopt+tests~unified build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]~allow-unsupported-compilers~dev arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]+optimize+pic+shared arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
Concretized
--------------------------------
- sqrws2v neuron@develop%[email protected]+binary~caliper+cmake~codechecks+coreneuron~cross-compile+debug~interviews~ipo+legacy-fr~legacy-unit+memacs+mod-compatibility+mpi+multisend~profile+pysetup+python~rx3d+shared+tests build_type=RelWithDebInfo model_tests=channel-benchmark,olfactory,tqperf-heavy patches=708cb04826b394a858069d93e8c08e1e81e914c23e1ef3da0486e8233834ff6c arch=linux-rhel7-skylake
- 6s6wcfe ^[email protected]%[email protected] arch=linux-rhel7-skylake
- ucwiakr ^[email protected]%[email protected]~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
[+] myaj5gm ^coreneuron@develop%[email protected]~caliper~codegenopt+gpu~ipo~ispc~knl~legacy-unit+mpi~nmodl+openmp~profile+report+shared~sympy~sympyopt+tests~unified build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] vn2t5vi ^[email protected]%[email protected]+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] gi5x2dn ^[email protected]%[email protected]~allow-unsupported-compilers~dev arch=linux-rhel7-skylake
[^] pjmdwuu ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] vtyvxbi ^[email protected]%[email protected]~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] tbfoeg7 ^[email protected]%[email protected]+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 6dfyugs ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] h7lotu6 ^[email protected]%[email protected]+optimize+pic+shared arch=linux-rhel7-skylake
[^] snd2rt6 ^[email protected]%[email protected]~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] f62u3oh ^[email protected]%[email protected]+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] bdkvweu ^[email protected]%[email protected]~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
- ajxdymq ^[email protected]%[email protected]+lex~nls arch=linux-rhel7-skylake
[^] lswrnpk ^[email protected]%[email protected]+bzip2+curses+git~libunistring+libxml2+tar+xz patches=6e530daaae14725d578d6cafbf7d523accc9ed29fd817bd421cf98a5f51e9e1b,bbe9f0539aa504966ac104224d25a9260faa1015ed3adda936467be9c7de4eae,fb27a3fb5e414bdc50ffebbbe2da986473df70a493caa4396226f51a67c55424 arch=linux-rhel7-skylake
[^] h3k3sur ^[email protected]%[email protected]~debug~pic+shared arch=linux-rhel7-skylake
[^] f4hjvxq ^[email protected]%[email protected] libs=shared,static arch=linux-rhel7-skylake
[^] i3pi6vv ^[email protected]%[email protected]~python patches=05ff238cf435825ef835c7ae39376b52dc83d8caf19e962f0766c841386a305a,10a88ad47f9797cf7cf2d7d07241f665a3b6d1f31fa026728c8c2ae93e1664e9 arch=linux-rhel7-skylake
[^] cqwvvjz ^[email protected]%[email protected]~pic libs=shared,static arch=linux-rhel7-skylake
[^] jpe3but ^[email protected]%[email protected]~symlinks+termlib abi=none arch=linux-rhel7-skylake
[^] zmpra4b ^[email protected]%[email protected] patches=08921fcbd732050c74ddf1de7d8ad95ffdbc09f8b4342456fa2f6a0dd02a957c,125cd6142fac2cc339e9aebfe79e40f90766022b8e8401532b1729e84fc148c2,5c314db58d005043bf407abaf25eb9823b9032a22fd12a0b142d4bf548130fa4,d428578be7fb99b831eb61e53b8d88a859afe08b479a21238180899707d79ce4 arch=linux-rhel7-skylake
- cp3ofsp ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] v2mlx7s ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] l4z6bjq ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] kohlctm ^[email protected]%[email protected]+blas+lapack patches=cf407c1024b0878c4222dd352aa9dece412073bb15b138243a2893725434c7b6 arch=linux-rhel7-skylake
[^] r5cvgru ^[email protected]%[email protected]~ilp64+shared threads=none arch=linux-rhel7-skylake
[^] dzkzhqs ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] iopgs7j ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] prphkb7 ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] iwg6vas ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] o7j2crb ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] uecslka ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] fjgegze ^[email protected]%[email protected]+toml arch=linux-rhel7-skylake
[^] xsvykjn ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] w6kat5w ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] qhehf4e ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] yg6s277 ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] l6udtg7 ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] lfjn2o4 ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] 3333cwm ^[email protected]%[email protected] arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> [email protected] : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-6s6wcfeanp2mkdib4c3n3ivkcuosopgm)
==> [email protected] : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-ucwiakreeghdgbo22kbqqhgnnlwxqtnn)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/boost-1.78.0-hiorc7 (external boost-1.78.0-vn2t5viljvtb5wyfhgmx57txyeiznxo4)
==> [email protected] : has external module in ['cuda/11.6.1']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cuda-11.6.1-ngetva (external cuda-11.6.1-gi5x2dn47i7nv76flbq7aatllxpaqmjp)
==> [email protected] : has external module in ['hpe-mpi/2.25.hmpt']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/hpe-mpi-2.25.hmpt-4ukyxt (external hpe-mpi-2.25.hmpt-pjmdwuu36ioiwqyrg6lrj7nrq3waqjj2)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/pkgconf-1.8.0-6dfyug
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/zlib-1.2.11-mswjss (external zlib-1.2.11-h7lotu6jszcvumwucry2y5jnhvxw5x2d)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/spdlog-1.9.2-snd2rt
==> [email protected] : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-f62u3ohchswt5q2b63chawohzqrl6wvy)
==> [email protected] : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-ajxdymqhnzqgeuhvk5nu5zfymzq35n6i)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bzip2-1.0.8-eu2d6e (external bzip2-1.0.8-h3k3surwc24macvlxbcme35uztk6dqiw)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/libiconv-1.16-f4hjvx
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/xz-5.2.5-cqwvvj
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ncurses-6.2-nsve5x (external ncurses-6.2-jpe3butfegqspi6dnm7cyikpjt5nphfo)
[+] /usr (external tar-1.28-zmpra4bla6bcjnayn6eze4rnd5u4txfm)
==> [email protected] : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-cp3ofspgkdqaqih5vrhotmdwvkozfswp)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/intel-mkl-2020.4.304-rzr3hj (external intel-mkl-2020.4.304-r5cvgrurdnt6y267arsmmmgqi75ouxd2)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/readline-8.1-we2frg (external readline-8.1-3333cwmykvahrsdydir4qeyasic3liq6)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/reportinglib-2.5.6-bdkvwe
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/hdf5-1.10.7-tbfoeg
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-setuptools-57.4.0-l4z6bj
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/libxml2-2.9.12-i3pi6v
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/libsonata-report-1.1-vtyvxb
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-mpi4py-3.1.2-v2mlx7
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-pip-21.1.2-w6kat5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-toml-0.10.2-yg6s27
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-iniconfig-1.1.1-prphkb
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-numpy-1.17.5-kohlct
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-pyparsing-2.4.7-o7j2cr
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-coverage-5.5-lfjn2o
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-attrs-21.2.0-iopgs7
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/gettext-0.21-lswrnp
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/software/install_nvhpc-22.3-skylake/coreneuron-develop-myaj5g
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-tomli-1.2.1-xsvykj
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-packaging-21.0-iwg6va
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-setuptools-scm-6.3.2-fjgegz
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-py-1.9.0-qhehf4
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-pluggy-0.13.0-uecslk
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-pytest-6.2.4-dzkzhq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-pytest-cov-2.8.1-l6udtg
==> Installing neuron-develop-sqrws2vsdssc3xwshyyrvkm5oyeurofv
==> No binary for neuron-develop-sqrws2vsdssc3xwshyyrvkm5oyeurofv found: installing from source
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472, but it is owned by 0
==> Applied patch /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/bluebrain/repo-patches/packages/neuron/revert_Import3d_numerical_format.master.patch
==> neuron: Executing phase: 'cmake'
==> neuron: Executing phase: 'build'
==> neuron: Executing phase: 'install'
==> neuron: Successfully installed neuron-develop-sqrws2vsdssc3xwshyyrvkm5oyeurofv
Fetch: 19.36s. Build: 17m 22.18s. Total: 17m 41.54s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/software/install_nvhpc-22.3-skylake/neuron-develop-sqrws2
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
Cache directory: /nvme/bbpcihpcproj12/379200/ccache
Primary config: /nvme/bbpcihpcproj12/379200/ccache/ccache.conf
Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
Stats updated: Wed Apr 13 12:12:52 2022
Hits: 718 / 815 (88.10 %)
Direct: 336 / 824 (40.78 %)
Preprocessed: 382 / 482 (79.25 %)
Misses: 97
Direct: 488
Preprocessed: 100
Uncacheable: 122
Primary storage:
Hits: 1360 / 1642 (82.83 %)
Misses: 282
Cache size (GB): 0.47 / 0.51 (91.04 %)
Files: 30525
Uncacheable:
Autoconf compile/link: 7
Called for linking: 99
Called for preprocessing: 1
Compilation failed: 3
No input file: 6
Preprocessing failed: 6
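The headline percentage in the ccache summary above is simply hits over total cacheable compiler invocations; a minimal sketch of that arithmetic, using the numbers printed in this job's summary (the formula is an assumption that matches the printed figures):

```python
# Minimal sketch: reproduce the headline ccache hit rate printed above.
# 718 hits and 97 misses are taken from this job's "Summary:" block.
hits, misses = 718, 97
total = hits + misses                                   # 815 cacheable compiler invocations
print(f"Hits: {hits} / {total} ({hits / total:.2%})")   # -> Hits: 718 / 815 (88.10%)
```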
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading py-virtualenv/16.7.6
Autoloading py-scipy/1.7.1
Autoloading py-numpy/1.19.5
Autoloading py-mpi4py/3.1.2
Autoloading hpe-mpi/2.25.hmpt
Autoloading py-h5py/3.4.0
Autoloading hdf5/1.10.7
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
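The num_failures one-liner above counts the <failure> and <error> elements in the JUnit report that `spack install --log-format=junit` wrote to install.xml, and the job exits non-zero if any are present. The same check written out as a short script for readability:

```python
# Expanded form of the inline num_failures check above: count <failure> and
# <error> elements in the JUnit XML produced by spack install --log-format=junit.
from lxml import etree

root = etree.parse("install.xml").getroot()
num_failures = (sum(1 for _ in root.iter("failure"))
                + sum(1 for _ in root.iter("error")))
# Mirrors: if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
if num_failures > 0:
    raise SystemExit(num_failures)
```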
section_end:1649844780:step_script section_start:1649844780:archive_cache Saving cache for successful job
Creating cache build:neuron:mod2c:nvhpc:acc-8...
Runtime platform  arch=amd64 os=linux pid=52775 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Aneuron%3Amod2c%3Anvhpc%3Aacc-8
Created cache
section_end:1649844810:archive_cache section_start:1649844810:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=53397 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=211025 responseStatus=201 Created token=xznJFiB7
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=53441 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=211025 responseStatus=201 Created token=xznJFiB7
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=53481 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=211025 responseStatus=201 Created token=xznJFiB7
section_end:1649844811:upload_artifacts_on_success section_start:1649844811:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1649844812:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1649843982:resolve_secrets Resolving secrets
section_end:1649843982:resolve_secrets section_start:1649843982:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor953715686, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 48472, build ref 9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6, job ID 211029
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
A slurm job will be created with name GL_J211029_PROD_P112_CP2_C8
Job parameters: memory=76G, cpus_per_task=8, duration=2:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 379274
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J211029_PROD_P112_CP2_C8 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --time=2:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --ntasks=2 --jobid=379274 --cpus-per-task=8 --mem=76G
section_end:1649843983:prepare_executor section_start:1649843983:prepare_script Preparing environment
Running on r1i4n0 via bbpv1.epfl.ch...
section_end:1649843985:prepare_script section_start:1649843985:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1649843986:get_sources section_start:1649843986:restore_cache Restoring cache
Checking cache for build:neuron:nmodl:intel-8...
Runtime platform  arch=amd64 os=linux pid=48646 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1649843993:restore_cache section_start:1649843993:download_artifacts Downloading artifacts
Downloading artifacts for build:coreneuron:nmodl:intel (211024)...
Runtime platform  arch=amd64 os=linux pid=49016 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=211024 responseStatus=200 OK token=ym6v9xs-
section_end:1649843994:download_artifacts section_start:1649843994:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211029/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211029/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211029/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211029/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211029/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access [email protected] (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access [email protected] (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472/J211029_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = [email protected]:" > "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
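The config shown above makes git transparently rewrite any git@bbpgitlab.epfl.ch: remote to an HTTPS URL carrying the job token, so clones performed during the Spack build authenticate without SSH keys. A minimal sketch that writes the same rule, assuming CI_JOB_TOKEN and XDG_CONFIG_HOME are set as in this job:

```python
# Minimal sketch: write the same insteadOf rewrite that the job script echoes above.
# Assumes CI_JOB_TOKEN and XDG_CONFIG_HOME are provided by the CI environment.
import os
from pathlib import Path

git_config_dir = Path(os.environ["XDG_CONFIG_HOME"]) / "git"
git_config_dir.mkdir(parents=True, exist_ok=True)
token = os.environ["CI_JOB_TOKEN"]
(git_config_dir / "config").write_text(
    f'[url "https://gitlab-ci-token:{token}@bbpgitlab.epfl.ch/"]\n'
    "    insteadOf = git@bbpgitlab.epfl.ch:\n"
)
```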
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install neuron%intel +coreneuron+debug+tests~legacy-unit~rx3d model_tests=channel-benchmark,olfactory,tqperf-heavy ^/x2m7t35nswt5wf62dii5jjfu45snneyv
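The SPACK_FULL_SPEC assembly above relies on bash's ${VAR:+word} expansion to insert the '%' separator only when a compiler is set. A minimal Python sketch of the same assembly, using the values echoed above and assuming SPACK_PACKAGE_DEPENDENCIES is empty in this job:

```python
# Minimal sketch of the SPACK_FULL_SPEC assembly above; '%' is only added when a
# compiler is set, mirroring bash's ${SPACK_PACKAGE_COMPILER:+%}.
package = "neuron"
compiler = "intel"
spec = "+coreneuron+debug+tests~legacy-unit~rx3d model_tests=channel-benchmark,olfactory,tqperf-heavy"
dependencies = ""  # assumed empty for this job
dependency_on_previous_job = "^/x2m7t35nswt5wf62dii5jjfu45snneyv"

full_spec = (f"{package}{'%' if compiler else ''}{compiler} "
             f"{spec} {dependencies} {dependency_on_previous_job}")
print(f"Preparing to install {full_spec}")
```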
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be 4ljxz6sigfw2sqg4xtopvwa3qqcjfsk4
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
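The lines above derive the stage, build and source directories from the hash that `spack spec --json` predicts for the concretized root spec. A minimal sketch of the same derivation, assuming the JSON document exposes the root node first under spec.nodes, as the inline one-liner above does:

```python
# Minimal sketch: reproduce the hash extraction and directory derivation above.
# Assumes `spack spec --json <spec>` lists the root node first, as the inline
# python -c one-liner in this job relies on.
import json
import subprocess

spack_full_spec = "neuron%intel +coreneuron+debug+tests~legacy-unit~rx3d"  # illustrative spec
json_spec = subprocess.run(
    ["spack", "spec", "--json"] + spack_full_spec.split(),
    check=True, capture_output=True, text=True,
).stdout
installed_hash = json.loads(json_spec)["spec"]["nodes"][0]["hash"]

spack_build = "spack-build"
stage_dir = f"{spack_build}/spack-stage-neuron-develop-{installed_hash}"
build_dir = f"{stage_dir}/spack-build-{installed_hash[:7]}"   # ${SPACK_INSTALLED_HASH:0:7}
source_dir = f"{stage_dir}/spack-src"
print(stage_dir, build_dir, source_dir)
```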
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
Cache directory: /nvme/bbpcihpcproj12/379274/ccache
Primary config: /nvme/bbpcihpcproj12/379274/ccache/ccache.conf
Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
Stats updated: Wed Apr 13 12:00:26 2022
Hits: 0 / 0
Direct: 0 / 0
Preprocessed: 0 / 0
Misses: 0
Direct: 0
Preprocessed: 0
Primary storage:
Hits: 0 / 0
Misses: 0
Cache size (GB): 0.47 / 0.51 (91.65 %)
Files: 26529
$ fi
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: '4ljxz6sigfw2sqg4xtopvwa3qqcjfsk4'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- neuron%intel+coreneuron+debug~legacy-unit~rx3d+tests model_tests=channel-benchmark,olfactory,tqperf-heavy
[+] ^coreneuron@develop%[email protected]~caliper~codegenopt~gpu~ipo~ispc~knl~legacy-unit+mpi+nmodl~openmp~profile+report+shared~sympy~sympyopt+tests~unified build_type=Debug arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]+optimize+pic+shared arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[+] ^nmodl@develop%[email protected]~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]~i18n arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]+libyaml arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ^[email protected]%[email protected]~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
Concretized
--------------------------------
- 4ljxz6s neuron@develop%intel@2021.4.0+binary~caliper+cmake~codechecks+coreneuron~cross-compile+debug~interviews~ipo+legacy-fr~legacy-unit+memacs+mod-compatibility+mpi+multisend~profile+pysetup+python~rx3d+shared+tests build_type=RelWithDebInfo model_tests=channel-benchmark,olfactory,tqperf-heavy patches=708cb04826b394a858069d93e8c08e1e81e914c23e1ef3da0486e8233834ff6c arch=linux-rhel7-skylake
- dwdch6b ^[email protected]%[email protected] arch=linux-rhel7-skylake
- z3q5f3x ^[email protected]%[email protected]~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
[+] x2m7t35 ^coreneuron@develop%[email protected]~caliper~codegenopt~gpu~ipo~ispc~knl~legacy-unit+mpi+nmodl~openmp~profile+report+shared~sympy~sympyopt+tests~unified build_type=Debug arch=linux-rhel7-skylake
[^] dl7pfht ^[email protected]%[email protected]+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] hyunzkn ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] vutuzbn ^[email protected]%[email protected]~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] utj47fz ^[email protected]%[email protected]+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ihxi5rl ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] evtnqzd ^[email protected]%[email protected]+optimize+pic+shared arch=linux-rhel7-skylake
[^] x72vveu ^[email protected]%[email protected]~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[+] 7hnncpj ^nmodl@develop%[email protected]~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] onq3mhg ^[email protected]%[email protected]~i18n arch=linux-rhel7-skylake
[^] 4ectow5 ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] 72xzp3v ^[email protected]%[email protected]+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] 22arfs4 ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ascbeii ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] kvw3vhm ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] 7iyiygo ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] c7qvw2q ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] mazoiox ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] y7rfzdj ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] v4z3s5e ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] w4gddqx ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] ci5oe5b ^[email protected]%[email protected]+libyaml arch=linux-rhel7-skylake
[^] xj2wlac ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] xtett6q ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] dzb2mfs ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] 5fkun4i ^[email protected]%[email protected]~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
- s4ueg72 ^[email protected]%[email protected]+lex~nls arch=linux-rhel7-skylake
[^] iu2b5hx ^[email protected]%[email protected]+bzip2+curses+git~libunistring+libxml2+tar+xz arch=linux-rhel7-skylake
[^] 3rmq3zx ^[email protected]%[email protected]~debug~pic+shared arch=linux-rhel7-skylake
[^] hxxlexb ^[email protected]%[email protected] libs=shared,static arch=linux-rhel7-skylake
[^] dnxqn2k ^[email protected]%[email protected]~python arch=linux-rhel7-skylake
[^] jzpqn5y ^[email protected]%[email protected]~pic libs=shared,static arch=linux-rhel7-skylake
[^] ams67cx ^[email protected]%[email protected]~symlinks+termlib abi=none arch=linux-rhel7-skylake
[^] ir7xtbl ^[email protected]%[email protected] patches=08921fcbd732050c74ddf1de7d8ad95ffdbc09f8b4342456fa2f6a0dd02a957c,125cd6142fac2cc339e9aebfe79e40f90766022b8e8401532b1729e84fc148c2,5c314db58d005043bf407abaf25eb9823b9032a22fd12a0b142d4bf548130fa4,d428578be7fb99b831eb61e53b8d88a859afe08b479a21238180899707d79ce4 arch=linux-rhel7-skylake
- 6ggc5yr ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] n6q4vfz ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] gjm7kkq ^[email protected]%[email protected]+blas+lapack patches=8a9d5d1b3f145c043b8b04869e7d46c6ff95c3f486d84f69693017c7e6190c7d arch=linux-rhel7-skylake
[^] r5cvgru ^[email protected]%[email protected]~ilp64+shared threads=none arch=linux-rhel7-skylake
[^] h2fsi6i ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] q5n7ofc ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] fudvy5v ^[email protected]%[email protected] arch=linux-rhel7-skylake
[^] 5abs2t7 ^[email protected]%[email protected] arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> bison@3.8.2 : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-dwdch6bmdeclr2novthsywtrryotawwz)
==> cmake@3.21.4 : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-z3q5f3xwpuibd3qbgdscqmu3efarbu42)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/boost-1.78.0-hiorc7 (external boost-1.78.0-dl7pfhtsvyaq3273gr2g3l5vr37eeydc)
==> hpe-mpi@2.25.hmpt : has external module in ['hpe-mpi/2.25.hmpt']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/hpe-mpi-2.25.hmpt-4ukyxt (external hpe-mpi-2.25.hmpt-hyunzknktdzri34bx26bupkvapehonql)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/pkgconf-1.8.0-ihxi5r
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/zlib-1.2.11-mswjss (external zlib-1.2.11-evtnqzdtuvprwnbd2nljimpzywrhw4uo)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/spdlog-1.9.2-x72vve
==> python@3.9.7 : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-72xzp3vgzh5b424qevkfruhs3ajzqbiy)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libyaml-0.2.5-xj2wla
==> flex@2.6.3 : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-s4ueg72j7l6vkdyvfxj2tweo7v7s3otx)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bzip2-1.0.8-eu2d6e (external bzip2-1.0.8-3rmq3zxuntsiphthnvty6gtydbmbkwr5)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libiconv-1.16-hxxlex
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/xz-5.2.5-jzpqn5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ncurses-6.2-nsve5x (external ncurses-6.2-ams67cxbq5vc7wiay2ndr2ksce2igbfw)
[+] /usr (external tar-1.28-ir7xtblauhq3vtkpjrl7ou3nzevcsi3u)
==> ninja@1.10.2 : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-6ggc5yre7qddwxdjmn7sfptpdoiy4dtp)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/intel-mkl-2020.4.304-rzr3hj (external intel-mkl-2020.4.304-r5cvgrurdnt6y267arsmmmgqi75ouxd2)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/readline-8.1-we2frg (external readline-8.1-5abs2t7f6dtgbejv6kjzd4gjwhdlr7qr)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/reportinglib-2.5.6-5fkun4
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/hdf5-1.10.7-utj47f
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-markupsafe-2.0.1-4ectow
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pluggy-0.13.0-mazoio
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyparsing-2.4.7-c7qvw2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-iniconfig-1.1.1-kvw3vh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-setuptools-57.4.0-v4z3s5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-attrs-21.2.0-ascbei
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-toml-0.10.2-w4gddq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-mpmath-1.1.0-dzb2mf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-py-1.9.0-y7rfzd
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyyaml-5.3.1-ci5oe5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libxml2-2.9.12-dnxqn2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libsonata-report-1.1-vutuzb
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-jinja2-3.0.1-onq3mh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-packaging-21.0-7iyiyg
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-mpi4py-3.1.2-n6q4vf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-coverage-5.5-fudvy5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-cython-0.29.24-h2fsi6
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-sympy-1.8-xtett6
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/gettext-0.21-iu2b5h
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pytest-6.2.4-22arfs
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-numpy-1.19.5-gjm7kk
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/software/install_gcc-11.2.0-skylake/nmodl-develop-7hnncp
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pytest-cov-2.8.1-q5n7of
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/software/install_intel-2021.4.0-skylake/coreneuron-develop-x2m7t3
==> Installing neuron-develop-4ljxz6sigfw2sqg4xtopvwa3qqcjfsk4
==> No binary for neuron-develop-4ljxz6sigfw2sqg4xtopvwa3qqcjfsk4 found: installing from source
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472, but it is owned by 0
==> Applied patch /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/bluebrain/repo-patches/packages/neuron/revert_Import3d_numerical_format.master.patch
==> neuron: Executing phase: 'cmake'
==> neuron: Executing phase: 'build'
==> neuron: Executing phase: 'install'
==> neuron: Successfully installed neuron-develop-4ljxz6sigfw2sqg4xtopvwa3qqcjfsk4
Fetch: 39.72s. Build: 19m 51.35s. Total: 20m 31.07s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/software/install_intel-2021.4.0-skylake/neuron-develop-4ljxz6
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
Cache directory: /nvme/bbpcihpcproj12/379274/ccache
Primary config: /nvme/bbpcihpcproj12/379274/ccache/ccache.conf
Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
Stats updated: Wed Apr 13 12:21:57 2022
Hits: 702 / 800 (87.75 %)
Direct: 169 / 809 (20.89 %)
Preprocessed: 533 / 634 (84.07 %)
Misses: 98
Direct: 640
Preprocessed: 101
Uncacheable: 126
Primary storage:
Hits: 1175 / 1612 (72.89 %)
Misses: 437
Cache size (GB): 0.47 / 0.51 (92.26 %)
Files: 26871
Uncacheable:
Autoconf compile/link: 7
Called for linking: 103
Called for preprocessing: 1
Compilation failed: 3
No input file: 6
Preprocessing failed: 6
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading py-virtualenv/16.7.6
Autoloading py-scipy/1.7.1
Autoloading py-numpy/1.19.5
Autoloading py-mpi4py/3.1.2
Autoloading hpe-mpi/2.25.hmpt
Autoloading py-h5py/3.4.0
Autoloading hdf5/1.10.7
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
section_end:1649845322:step_script section_start:1649845322:archive_cache Saving cache for successful job
Creating cache build:neuron:nmodl:intel-8...
Runtime platform  arch=amd64 os=linux pid=94758 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Aneuron%3Anmodl%3Aintel-8
Created cache
section_end:1649845342:archive_cache section_start:1649845342:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=96775 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=211029 responseStatus=201 Created token=zngzupmC
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=96837 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=211029 responseStatus=201 Created token=zngzupmC
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=96887 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=211029 responseStatus=201 Created token=zngzupmC
section_end:1649845343:upload_artifacts_on_success section_start:1649845343:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1649845344:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1649844032:resolve_secrets Resolving secrets
section_end:1649844032:resolve_secrets section_start:1649844032:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor724833531, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 48472, build ref 9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6, job ID 211027
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
A slurm job will be created with name GL_J211027_PROD_P112_CP3_C2
Job parameters: memory=76G, cpus_per_task=8, duration=2:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 379282
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J211027_PROD_P112_CP3_C2 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --time=2:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --ntasks=2 --jobid=379282 --cpus-per-task=8 --mem=76G
section_end:1649844034:prepare_executor section_start:1649844034:prepare_script Preparing environment
Running on r1i4n0 via bbpv1.epfl.ch...
section_end:1649844036:prepare_script section_start:1649844036:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1649844037:get_sources section_start:1649844037:restore_cache Restoring cache
Checking cache for build:neuron:nmodl:nvhpc:acc-8...
Runtime platform  arch=amd64 os=linux pid=51849 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1649844044:restore_cache section_start:1649844044:download_artifacts Downloading artifacts
Downloading artifacts for build:coreneuron:nmodl:nvhpc:acc (211022)...
Runtime platform  arch=amd64 os=linux pid=52139 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=211022 responseStatus=200 OK token=T5xFNgyz
section_end:1649844045:download_artifacts section_start:1649844045:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211027/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211027/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211027/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211027/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211027/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access [email protected] (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access [email protected] (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472/J211027_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = [email protected]:" > "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install neuron%nvhpc +coreneuron+debug+tests~legacy-unit~rx3d model_tests=channel-benchmark,olfactory,tqperf-heavy ^/7eavmcfdqg53mknhnfbcd3xnco2pxk7o
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be aw3evwlkdey4vmmjzpsuscd3bnfgndfy
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
Cache directory: /nvme/bbpcihpcproj12/379282/ccache
Primary config: /nvme/bbpcihpcproj12/379282/ccache/ccache.conf
Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
Stats updated: Wed Apr 13 12:01:09 2022
Hits: 0 / 0
Direct: 0 / 0
Preprocessed: 0 / 0
Misses: 0
Direct: 0
Preprocessed: 0
Primary storage:
Hits: 0 / 0
Misses: 0
Cache size (GB): 0.47 / 0.51 (91.18 %)
Files: 27085
$ fi
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: 'aw3evwlkdey4vmmjzpsuscd3bnfgndfy'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- neuron%nvhpc+coreneuron+debug~legacy-unit~rx3d+tests model_tests=channel-benchmark,olfactory,tqperf-heavy
[+] ^coreneuron@develop%nvhpc@22.3~caliper~codegenopt+gpu~ipo~ispc~knl~legacy-unit+mpi+nmodl~openmp~profile+report+shared+sympy~sympyopt+tests~unified build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^boost@1.78.0%gcc@11.2.0+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] ^cuda@11.6.1%gcc@11.2.0~allow-unsupported-compilers~dev arch=linux-rhel7-skylake
[^] ^hpe-mpi@2.25.hmpt%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^libsonata-report@1.1%gcc@11.2.0~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^hdf5@1.10.7%gcc@11.2.0+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^pkgconf@1.8.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^zlib@1.2.11%gcc@11.2.0+optimize+pic+shared arch=linux-rhel7-skylake
[^] ^spdlog@1.9.2%gcc@11.2.0~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[+] ^nmodl@develop%gcc@11.2.0~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^py-jinja2@3.0.1%gcc@11.2.0~i18n arch=linux-rhel7-skylake
[^] ^py-markupsafe@2.0.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^python@3.9.7%gcc@11.2.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] ^py-pytest@6.2.4%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-attrs@21.2.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-iniconfig@1.1.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-packaging@21.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pyparsing@2.4.7%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pluggy@0.13.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-py@1.9.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-setuptools@57.4.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-toml@0.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pyyaml@5.3.1%gcc@11.2.0+libyaml arch=linux-rhel7-skylake
[^] ^libyaml@0.2.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-sympy@1.8%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-mpmath@1.1.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^reportinglib@2.5.6%gcc@11.2.0~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
Concretized
--------------------------------
- aw3evwl neuron@develop%nvhpc@22.3+binary~caliper+cmake~codechecks+coreneuron~cross-compile+debug~interviews~ipo+legacy-fr~legacy-unit+memacs+mod-compatibility+mpi+multisend~profile+pysetup+python~rx3d+shared+tests build_type=RelWithDebInfo model_tests=channel-benchmark,olfactory,tqperf-heavy patches=708cb04826b394a858069d93e8c08e1e81e914c23e1ef3da0486e8233834ff6c arch=linux-rhel7-skylake
- 6s6wcfe ^bison@3.8.2%gcc@11.2.0 arch=linux-rhel7-skylake
- ucwiakr ^cmake@3.21.4%gcc@11.2.0~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
[+] 7eavmcf ^coreneuron@develop%nvhpc@22.3~caliper~codegenopt+gpu~ipo~ispc~knl~legacy-unit+mpi+nmodl~openmp~profile+report+shared+sympy~sympyopt+tests~unified build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] vn2t5vi ^boost@1.78.0%gcc@11.2.0+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] gi5x2dn ^cuda@11.6.1%gcc@11.2.0~allow-unsupported-compilers~dev arch=linux-rhel7-skylake
[^] hyunzkn ^hpe-mpi@2.25.hmpt%gcc@11.2.0 arch=linux-rhel7-skylake
[^] vutuzbn ^libsonata-report@1.1%gcc@11.2.0~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] utj47fz ^hdf5@1.10.7%gcc@11.2.0+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ihxi5rl ^pkgconf@1.8.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] evtnqzd ^zlib@1.2.11%gcc@11.2.0+optimize+pic+shared arch=linux-rhel7-skylake
[^] x72vveu ^spdlog@1.9.2%gcc@11.2.0~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[+] 7hnncpj ^nmodl@develop%gcc@11.2.0~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] onq3mhg ^py-jinja2@3.0.1%gcc@11.2.0~i18n arch=linux-rhel7-skylake
[^] 4ectow5 ^py-markupsafe@2.0.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 72xzp3v ^python@3.9.7%gcc@11.2.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] 22arfs4 ^py-pytest@6.2.4%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ascbeii ^py-attrs@21.2.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] kvw3vhm ^py-iniconfig@1.1.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 7iyiygo ^py-packaging@21.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] c7qvw2q ^py-pyparsing@2.4.7%gcc@11.2.0 arch=linux-rhel7-skylake
[^] mazoiox ^py-pluggy@0.13.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] y7rfzdj ^py-py@1.9.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] v4z3s5e ^py-setuptools@57.4.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] w4gddqx ^py-toml@0.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ci5oe5b ^py-pyyaml@5.3.1%gcc@11.2.0+libyaml arch=linux-rhel7-skylake
[^] xj2wlac ^libyaml@0.2.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] xtett6q ^py-sympy@1.8%gcc@11.2.0 arch=linux-rhel7-skylake
[^] dzb2mfs ^py-mpmath@1.1.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 5fkun4i ^reportinglib@2.5.6%gcc@11.2.0~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
- ajxdymq ^flex@2.6.3%gcc@11.2.0+lex~nls arch=linux-rhel7-skylake
[^] iu2b5hx ^gettext@0.21%gcc@11.2.0+bzip2+curses+git~libunistring+libxml2+tar+xz arch=linux-rhel7-skylake
[^] 3rmq3zx ^bzip2@1.0.8%gcc@11.2.0~debug~pic+shared arch=linux-rhel7-skylake
[^] hxxlexb ^libiconv@1.16%gcc@11.2.0 libs=shared,static arch=linux-rhel7-skylake
[^] dnxqn2k ^libxml2@2.9.12%gcc@11.2.0~python arch=linux-rhel7-skylake
[^] jzpqn5y ^xz@5.2.5%gcc@11.2.0~pic libs=shared,static arch=linux-rhel7-skylake
[^] ams67cx ^ncurses@6.2%gcc@11.2.0~symlinks+termlib abi=none arch=linux-rhel7-skylake
[^] ir7xtbl ^tar@1.28%gcc@11.2.0 patches=08921fcbd732050c74ddf1de7d8ad95ffdbc09f8b4342456fa2f6a0dd02a957c,125cd6142fac2cc339e9aebfe79e40f90766022b8e8401532b1729e84fc148c2,5c314db58d005043bf407abaf25eb9823b9032a22fd12a0b142d4bf548130fa4,d428578be7fb99b831eb61e53b8d88a859afe08b479a21238180899707d79ce4 arch=linux-rhel7-skylake
- cp3ofsp ^ninja@1.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] n6q4vfz ^py-mpi4py@3.1.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] gjm7kkq ^py-numpy@1.19.5%gcc@11.2.0+blas+lapack patches=8a9d5d1b3f145c043b8b04869e7d46c6ff95c3f486d84f69693017c7e6190c7d arch=linux-rhel7-skylake
[^] r5cvgru ^intel-mkl@2020.4.304%gcc@11.2.0~ilp64+shared threads=none arch=linux-rhel7-skylake
[^] h2fsi6i ^py-cython@0.29.24%gcc@11.2.0 arch=linux-rhel7-skylake
[^] q5n7ofc ^py-pytest-cov@2.8.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] fudvy5v ^py-coverage@5.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 3333cwm ^readline@8.1%gcc@11.2.0 arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> bison@3.8.2 : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-6s6wcfeanp2mkdib4c3n3ivkcuosopgm)
==> cmake@3.21.4 : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-ucwiakreeghdgbo22kbqqhgnnlwxqtnn)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/boost-1.78.0-hiorc7 (external boost-1.78.0-vn2t5viljvtb5wyfhgmx57txyeiznxo4)
==> cuda@11.6.1 : has external module in ['cuda/11.6.1']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cuda-11.6.1-ngetva (external cuda-11.6.1-gi5x2dn47i7nv76flbq7aatllxpaqmjp)
==> hpe-mpi@2.25.hmpt : has external module in ['hpe-mpi/2.25.hmpt']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/hpe-mpi-2.25.hmpt-4ukyxt (external hpe-mpi-2.25.hmpt-hyunzknktdzri34bx26bupkvapehonql)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/pkgconf-1.8.0-ihxi5r
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/zlib-1.2.11-mswjss (external zlib-1.2.11-evtnqzdtuvprwnbd2nljimpzywrhw4uo)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/spdlog-1.9.2-x72vve
==> python@3.9.7 : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-72xzp3vgzh5b424qevkfruhs3ajzqbiy)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libyaml-0.2.5-xj2wla
==> flex@2.6.3 : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-ajxdymqhnzqgeuhvk5nu5zfymzq35n6i)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bzip2-1.0.8-eu2d6e (external bzip2-1.0.8-3rmq3zxuntsiphthnvty6gtydbmbkwr5)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libiconv-1.16-hxxlex
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/xz-5.2.5-jzpqn5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ncurses-6.2-nsve5x (external ncurses-6.2-ams67cxbq5vc7wiay2ndr2ksce2igbfw)
[+] /usr (external tar-1.28-ir7xtblauhq3vtkpjrl7ou3nzevcsi3u)
==> ninja@1.10.2 : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-cp3ofspgkdqaqih5vrhotmdwvkozfswp)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/intel-mkl-2020.4.304-rzr3hj (external intel-mkl-2020.4.304-r5cvgrurdnt6y267arsmmmgqi75ouxd2)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/readline-8.1-we2frg (external readline-8.1-3333cwmykvahrsdydir4qeyasic3liq6)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/reportinglib-2.5.6-5fkun4
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/hdf5-1.10.7-utj47f
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-toml-0.10.2-w4gddq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pluggy-0.13.0-mazoio
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-attrs-21.2.0-ascbei
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-markupsafe-2.0.1-4ectow
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-iniconfig-1.1.1-kvw3vh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-py-1.9.0-y7rfzd
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyparsing-2.4.7-c7qvw2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-setuptools-57.4.0-v4z3s5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-mpmath-1.1.0-dzb2mf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyyaml-5.3.1-ci5oe5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libxml2-2.9.12-dnxqn2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libsonata-report-1.1-vutuzb
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-jinja2-3.0.1-onq3mh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-packaging-21.0-7iyiyg
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-mpi4py-3.1.2-n6q4vf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-cython-0.29.24-h2fsi6
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-coverage-5.5-fudvy5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-sympy-1.8-xtett6
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/gettext-0.21-iu2b5h
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pytest-6.2.4-22arfs
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-numpy-1.19.5-gjm7kk
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pytest-cov-2.8.1-q5n7of
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/software/install_gcc-11.2.0-skylake/nmodl-develop-7hnncp
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/software/install_nvhpc-22.3-skylake/coreneuron-develop-7eavmc
==> Installing neuron-develop-aw3evwlkdey4vmmjzpsuscd3bnfgndfy
==> No binary for neuron-develop-aw3evwlkdey4vmmjzpsuscd3bnfgndfy found: installing from source
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472, but it is owned by 0
==> Applied patch /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/bluebrain/repo-patches/packages/neuron/revert_Import3d_numerical_format.master.patch
==> neuron: Executing phase: 'cmake'
==> neuron: Executing phase: 'build'
==> neuron: Executing phase: 'install'
==> neuron: Successfully installed neuron-develop-aw3evwlkdey4vmmjzpsuscd3bnfgndfy
Fetch: 22.45s. Build: 28m 18.56s. Total: 28m 41.00s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/software/install_nvhpc-22.3-skylake/neuron-develop-aw3evw
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
  Cache directory: /nvme/bbpcihpcproj12/379282/ccache
  Primary config: /nvme/bbpcihpcproj12/379282/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Apr 13 12:30:46 2022
  Hits: 702 / 799 (87.86 %)
    Direct: 169 / 808 (20.92 %)
    Preprocessed: 533 / 633 (84.20 %)
  Misses: 97
    Direct: 639
    Preprocessed: 100
  Uncacheable: 119
Primary storage:
  Hits: 1175 / 1610 (72.98 %)
  Misses: 435
  Cache size (GB): 0.45 / 0.51 (88.12 %)
  Files: 26255
  Cleanups: 3
Uncacheable:
  Autoconf compile/link: 7
  Called for linking: 96
  Called for preprocessing: 1
  Compilation failed: 3
  No input file: 6
  Preprocessing failed: 6
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
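The percentages in the ccache summary above are simply hits divided by cacheable calls; a quick cross-check (illustration only, not part of the job script; the file name is hypothetical):

    # ccache_hit_rate.py -- hypothetical helper to sanity-check the summary above
    def hit_rate(hits: int, misses: int) -> float:
        # ccache reports hits / (hits + misses) as a percentage
        total = hits + misses
        return 100.0 * hits / total if total else 0.0

    print(round(hit_rate(702, 97), 2))    # 87.86, matching "Hits: 702 / 799 (87.86 %)"
    print(round(hit_rate(1175, 435), 2))  # 72.98, matching the "Primary storage" hits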
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading py-virtualenv/16.7.6
Autoloading py-scipy/1.7.1
Autoloading py-numpy/1.19.5
Autoloading py-mpi4py/3.1.2
Autoloading hpe-mpi/2.25.hmpt
Autoloading py-h5py/3.4.0
Autoloading hdf5/1.10.7
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
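For reference, a standalone sketch of what the inline python -c above computes from the JUnit report written by spack install --log-format=junit (the script name is hypothetical; only install.xml comes from the job):

    # count_failures.py -- hypothetical standalone version of the one-liner above
    from lxml import etree

    def count_failures(path: str = "install.xml") -> int:
        root = etree.parse(path).getroot()
        # JUnit reports mark problems as <failure> and <error> elements
        return sum(1 for _ in root.iter("failure")) + sum(1 for _ in root.iter("error"))

    if __name__ == "__main__":
        print(count_failures())

Note that [[ ${num_failures} > 0 ]] uses the lexicographic comparison operator of [[ ]]; against 0 it gives the intended result for these counts, though -gt would be the numeric form.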
section_end:1649845848:step_script section_start:1649845848:archive_cache Saving cache for successful job
Creating cache build:neuron:nmodl:nvhpc:acc-8...
Runtime platform  arch=amd64 os=linux pid=125157 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Aneuron%3Anmodl%3Anvhpc%3Aacc-8
Created cache
section_end:1649845866:archive_cache section_start:1649845866:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=125563 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=211027 responseStatus=201 Created token=sQF2SFDP
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=125611 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=211027 responseStatus=201 Created token=sQF2SFDP
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=125667 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=211027 responseStatus=201 Created token=sQF2SFDP
section_end:1649845867:upload_artifacts_on_success section_start:1649845867:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1649845868:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1649844033:resolve_secrets Resolving secrets
section_end:1649844033:resolve_secrets section_start:1649844033:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor758797407, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 48472, build ref 9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6, job ID 211026
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
A slurm job will be created with name GL_J211026_PROD_P112_CP8_C9
Job parameters: memory=76G, cpus_per_task=8, duration=2:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 379284
job state: PD
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J211026_PROD_P112_CP8_C9 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --time=2:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --ntasks=2 --jobid=379284 --cpus-per-task=8 --mem=76G
section_end:1649844037:prepare_executor section_start:1649844037:prepare_script Preparing environment
Running on r1i4n0 via bbpv1.epfl.ch...
section_end:1649844039:prepare_script section_start:1649844039:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1649844040:get_sources section_start:1649844040:restore_cache Restoring cache
Checking cache for build:neuron:nmodl:nvhpc:omp-8...
Runtime platform  arch=amd64 os=linux pid=52046 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1649844047:restore_cache section_start:1649844047:download_artifacts Downloading artifacts
Downloading artifacts for build:coreneuron:nmodl:nvhpc:omp (211021)...
Runtime platform  arch=amd64 os=linux pid=52640 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=211021 responseStatus=200 OK token=Koj5cr-4
section_end:1649844048:download_artifacts section_start:1649844048:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211026/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211026/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211026/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211026/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211026/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access [email protected] (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access [email protected] (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472/J211026_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = [email protected]:" > "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
 insteadOf = git@bbpgitlab.epfl.ch:
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install neuron%nvhpc +coreneuron+debug+tests~legacy-unit~rx3d model_tests=channel-benchmark,olfactory,tqperf-heavy ^/dqfui2d7qu3mc6w6xhrktts7kg2pjeio
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be 7rnzemtajzavcycpixj4vl565kzy3d6v
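A minimal sketch of the hash lookup performed above, assuming the JSON layout this Spack version emits (the root node is the first entry under spec/nodes); the wrapper name and the whitespace split of the spec string are illustrative only:

    # spec_hash.py -- hypothetical wrapper around "spack spec --json"
    import json
    import subprocess

    def installed_hash(full_spec: str) -> str:
        out = subprocess.run(["spack", "spec", "--json"] + full_spec.split(),
                             check=True, capture_output=True, text=True).stdout
        # Same lookup as the python -c one-liner: first node of the concretized spec
        return json.loads(out)["spec"]["nodes"][0]["hash"]

This is the hash the job later writes to spack_build_info.env so that the next stage can depend on it via ^/<hash>.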
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
  Cache directory: /nvme/bbpcihpcproj12/379284/ccache
  Primary config: /nvme/bbpcihpcproj12/379284/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Apr 13 12:01:13 2022
  Hits: 0 / 0
    Direct: 0 / 0
    Preprocessed: 0 / 0
  Misses: 0
    Direct: 0
    Preprocessed: 0
Primary storage:
  Hits: 0 / 0
  Misses: 0
  Cache size (GB): 0.46 / 0.51 (90.07 %)
  Files: 26966
$ fi
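The warm compiler cache travels between pipeline jobs as a plain tarball (ccache.tar, restored above when present, re-created after the build, and stored by the GitLab cache step). A minimal sketch of that round trip, assuming the same CCACHE_DIR and CI_PROJECT_DIR layout; the function names are illustrative:

    # ccache_roundtrip.py -- hypothetical illustration of the tar -xf / tar -cf pair
    import os
    import tarfile

    def restore_ccache(ccache_dir: str, project_dir: str) -> None:
        archive = os.path.join(project_dir, "ccache.tar")
        if os.path.isfile(archive):            # mirrors: if [ -f ${CI_PROJECT_DIR}/ccache.tar ]
            with tarfile.open(archive) as tar:
                tar.extractall(ccache_dir)     # mirrors: tar -C ${CCACHE_DIR} -xf ccache.tar

    def save_ccache(ccache_dir: str, project_dir: str) -> None:
        with tarfile.open(os.path.join(project_dir, "ccache.tar"), "w") as tar:
            tar.add(ccache_dir, arcname=".")   # mirrors: tar -C ${CCACHE_DIR} -cf ccache.tar .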
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: '7rnzemtajzavcycpixj4vl565kzy3d6v'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- neuron%nvhpc+coreneuron+debug~legacy-unit~rx3d+tests model_tests=channel-benchmark,olfactory,tqperf-heavy
[+] ^coreneuron@develop%nvhpc@22.3~caliper~codegenopt+gpu~ipo~ispc~knl~legacy-unit+mpi+nmodl+openmp~profile+report+shared~sympy~sympyopt+tests~unified build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^boost@1.78.0%gcc@11.2.0+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] ^cuda@11.6.1%gcc@11.2.0~allow-unsupported-compilers~dev arch=linux-rhel7-skylake
[^] ^hpe-mpi@2.25.hmpt%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^libsonata-report@1.1%gcc@11.2.0~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^hdf5@1.10.7%gcc@11.2.0+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^pkgconf@1.8.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^zlib@1.2.11%gcc@11.2.0+optimize+pic+shared arch=linux-rhel7-skylake
[^] ^spdlog@1.9.2%gcc@11.2.0~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[+] ^nmodl@develop%gcc@11.2.0~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^py-jinja2@3.0.1%gcc@11.2.0~i18n arch=linux-rhel7-skylake
[^] ^py-markupsafe@2.0.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^python@3.9.7%gcc@11.2.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] ^py-pytest@6.2.4%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-attrs@21.2.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-iniconfig@1.1.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-packaging@21.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pyparsing@2.4.7%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pluggy@0.13.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-py@1.9.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-setuptools@57.4.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-toml@0.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pyyaml@5.3.1%gcc@11.2.0+libyaml arch=linux-rhel7-skylake
[^] ^libyaml@0.2.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-sympy@1.8%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-mpmath@1.1.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^reportinglib@2.5.6%gcc@11.2.0~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
Concretized
--------------------------------
- 7rnzemt neuron@develop%nvhpc@22.3+binary~caliper+cmake~codechecks+coreneuron~cross-compile+debug~interviews~ipo+legacy-fr~legacy-unit+memacs+mod-compatibility+mpi+multisend~profile+pysetup+python~rx3d+shared+tests build_type=RelWithDebInfo model_tests=channel-benchmark,olfactory,tqperf-heavy patches=708cb04826b394a858069d93e8c08e1e81e914c23e1ef3da0486e8233834ff6c arch=linux-rhel7-skylake
- 6s6wcfe ^bison@3.8.2%gcc@11.2.0 arch=linux-rhel7-skylake
- ucwiakr ^cmake@3.21.4%gcc@11.2.0~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
[+] dqfui2d ^coreneuron@develop%nvhpc@22.3~caliper~codegenopt+gpu~ipo~ispc~knl~legacy-unit+mpi+nmodl+openmp~profile+report+shared~sympy~sympyopt+tests~unified build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] vn2t5vi ^boost@1.78.0%gcc@11.2.0+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] gi5x2dn ^cuda@11.6.1%gcc@11.2.0~allow-unsupported-compilers~dev arch=linux-rhel7-skylake
[^] hyunzkn ^hpe-mpi@2.25.hmpt%gcc@11.2.0 arch=linux-rhel7-skylake
[^] vutuzbn ^libsonata-report@1.1%gcc@11.2.0~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] utj47fz ^hdf5@1.10.7%gcc@11.2.0+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ihxi5rl ^pkgconf@1.8.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] evtnqzd ^zlib@1.2.11%gcc@11.2.0+optimize+pic+shared arch=linux-rhel7-skylake
[^] x72vveu ^spdlog@1.9.2%gcc@11.2.0~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[+] 7hnncpj ^nmodl@develop%gcc@11.2.0~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] onq3mhg ^py-jinja2@3.0.1%gcc@11.2.0~i18n arch=linux-rhel7-skylake
[^] 4ectow5 ^py-markupsafe@2.0.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 72xzp3v ^python@3.9.7%gcc@11.2.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] 22arfs4 ^py-pytest@6.2.4%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ascbeii ^py-attrs@21.2.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] kvw3vhm ^py-iniconfig@1.1.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 7iyiygo ^py-packaging@21.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] c7qvw2q ^py-pyparsing@2.4.7%gcc@11.2.0 arch=linux-rhel7-skylake
[^] mazoiox ^py-pluggy@0.13.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] y7rfzdj ^py-py@1.9.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] v4z3s5e ^py-setuptools@57.4.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] w4gddqx ^py-toml@0.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ci5oe5b ^py-pyyaml@5.3.1%gcc@11.2.0+libyaml arch=linux-rhel7-skylake
[^] xj2wlac ^libyaml@0.2.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] xtett6q ^py-sympy@1.8%gcc@11.2.0 arch=linux-rhel7-skylake
[^] dzb2mfs ^py-mpmath@1.1.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 5fkun4i ^reportinglib@2.5.6%gcc@11.2.0~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
- ajxdymq ^flex@2.6.3%gcc@11.2.0+lex~nls arch=linux-rhel7-skylake
[^] iu2b5hx ^gettext@0.21%gcc@11.2.0+bzip2+curses+git~libunistring+libxml2+tar+xz arch=linux-rhel7-skylake
[^] 3rmq3zx ^bzip2@1.0.8%gcc@11.2.0~debug~pic+shared arch=linux-rhel7-skylake
[^] hxxlexb ^libiconv@1.16%gcc@11.2.0 libs=shared,static arch=linux-rhel7-skylake
[^] dnxqn2k ^libxml2@2.9.12%gcc@11.2.0~python arch=linux-rhel7-skylake
[^] jzpqn5y ^xz@5.2.5%gcc@11.2.0~pic libs=shared,static arch=linux-rhel7-skylake
[^] ams67cx ^ncurses@6.2%gcc@11.2.0~symlinks+termlib abi=none arch=linux-rhel7-skylake
[^] ir7xtbl ^tar@1.28%gcc@11.2.0 patches=08921fcbd732050c74ddf1de7d8ad95ffdbc09f8b4342456fa2f6a0dd02a957c,125cd6142fac2cc339e9aebfe79e40f90766022b8e8401532b1729e84fc148c2,5c314db58d005043bf407abaf25eb9823b9032a22fd12a0b142d4bf548130fa4,d428578be7fb99b831eb61e53b8d88a859afe08b479a21238180899707d79ce4 arch=linux-rhel7-skylake
- cp3ofsp ^ninja@1.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] n6q4vfz ^py-mpi4py@3.1.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] gjm7kkq ^py-numpy@1.19.5%gcc@11.2.0+blas+lapack patches=8a9d5d1b3f145c043b8b04869e7d46c6ff95c3f486d84f69693017c7e6190c7d arch=linux-rhel7-skylake
[^] r5cvgru ^intel-mkl@2020.4.304%gcc@11.2.0~ilp64+shared threads=none arch=linux-rhel7-skylake
[^] h2fsi6i ^py-cython@0.29.24%gcc@11.2.0 arch=linux-rhel7-skylake
[^] q5n7ofc ^py-pytest-cov@2.8.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] fudvy5v ^py-coverage@5.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 3333cwm ^readline@8.1%gcc@11.2.0 arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> bison@3.8.2 : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-6s6wcfeanp2mkdib4c3n3ivkcuosopgm)
==> cmake@3.21.4 : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-ucwiakreeghdgbo22kbqqhgnnlwxqtnn)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/boost-1.78.0-hiorc7 (external boost-1.78.0-vn2t5viljvtb5wyfhgmx57txyeiznxo4)
==> cuda@11.6.1 : has external module in ['cuda/11.6.1']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cuda-11.6.1-ngetva (external cuda-11.6.1-gi5x2dn47i7nv76flbq7aatllxpaqmjp)
==> hpe-mpi@2.25.hmpt : has external module in ['hpe-mpi/2.25.hmpt']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/hpe-mpi-2.25.hmpt-4ukyxt (external hpe-mpi-2.25.hmpt-hyunzknktdzri34bx26bupkvapehonql)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/pkgconf-1.8.0-ihxi5r
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/zlib-1.2.11-mswjss (external zlib-1.2.11-evtnqzdtuvprwnbd2nljimpzywrhw4uo)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/spdlog-1.9.2-x72vve
==> python@3.9.7 : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-72xzp3vgzh5b424qevkfruhs3ajzqbiy)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libyaml-0.2.5-xj2wla
==> flex@2.6.3 : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-ajxdymqhnzqgeuhvk5nu5zfymzq35n6i)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bzip2-1.0.8-eu2d6e (external bzip2-1.0.8-3rmq3zxuntsiphthnvty6gtydbmbkwr5)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libiconv-1.16-hxxlex
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/xz-5.2.5-jzpqn5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ncurses-6.2-nsve5x (external ncurses-6.2-ams67cxbq5vc7wiay2ndr2ksce2igbfw)
[+] /usr (external tar-1.28-ir7xtblauhq3vtkpjrl7ou3nzevcsi3u)
==> ninja@1.10.2 : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-cp3ofspgkdqaqih5vrhotmdwvkozfswp)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/intel-mkl-2020.4.304-rzr3hj (external intel-mkl-2020.4.304-r5cvgrurdnt6y267arsmmmgqi75ouxd2)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/readline-8.1-we2frg (external readline-8.1-3333cwmykvahrsdydir4qeyasic3liq6)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/reportinglib-2.5.6-5fkun4
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/hdf5-1.10.7-utj47f
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-py-1.9.0-y7rfzd
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyparsing-2.4.7-c7qvw2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pluggy-0.13.0-mazoio
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-attrs-21.2.0-ascbei
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-mpmath-1.1.0-dzb2mf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-markupsafe-2.0.1-4ectow
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-iniconfig-1.1.1-kvw3vh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-toml-0.10.2-w4gddq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-setuptools-57.4.0-v4z3s5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyyaml-5.3.1-ci5oe5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libxml2-2.9.12-dnxqn2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libsonata-report-1.1-vutuzb
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-packaging-21.0-7iyiyg
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-sympy-1.8-xtett6
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-jinja2-3.0.1-onq3mh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-cython-0.29.24-h2fsi6
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-coverage-5.5-fudvy5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-mpi4py-3.1.2-n6q4vf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/gettext-0.21-iu2b5h
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pytest-6.2.4-22arfs
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-numpy-1.19.5-gjm7kk
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/software/install_gcc-11.2.0-skylake/nmodl-develop-7hnncp
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pytest-cov-2.8.1-q5n7of
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/software/install_nvhpc-22.3-skylake/coreneuron-develop-dqfui2
==> Installing neuron-develop-7rnzemtajzavcycpixj4vl565kzy3d6v
==> No binary for neuron-develop-7rnzemtajzavcycpixj4vl565kzy3d6v found: installing from source
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472, but it is owned by 0
==> Applied patch /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/bluebrain/repo-patches/packages/neuron/revert_Import3d_numerical_format.master.patch
==> neuron: Executing phase: 'cmake'
==> neuron: Executing phase: 'build'
==> neuron: Executing phase: 'install'
==> neuron: Successfully installed neuron-develop-7rnzemtajzavcycpixj4vl565kzy3d6v
Fetch: 29.83s. Build: 27m 16.80s. Total: 27m 46.63s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/software/install_nvhpc-22.3-skylake/neuron-develop-7rnzem
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
  Cache directory: /nvme/bbpcihpcproj12/379284/ccache
  Primary config: /nvme/bbpcihpcproj12/379284/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Apr 13 12:29:55 2022
  Hits: 702 / 799 (87.86 %)
    Direct: 166 / 808 (20.54 %)
    Preprocessed: 536 / 636 (84.28 %)
  Misses: 97
    Direct: 642
    Preprocessed: 100
  Uncacheable: 119
Primary storage:
  Hits: 1175 / 1610 (72.98 %)
  Misses: 435
  Cache size (GB): 0.46 / 0.51 (90.76 %)
  Files: 27301
Uncacheable:
  Autoconf compile/link: 7
  Called for linking: 96
  Called for preprocessing: 1
  Compilation failed: 3
  No input file: 6
  Preprocessing failed: 6
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading py-virtualenv/16.7.6
Autoloading py-scipy/1.7.1
Autoloading py-numpy/1.19.5
Autoloading py-mpi4py/3.1.2
Autoloading hpe-mpi/2.25.hmpt
Autoloading py-h5py/3.4.0
Autoloading hdf5/1.10.7
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
section_end:1649845798:step_script section_start:1649845798:archive_cache Saving cache for successful job
Creating cache build:neuron:nmodl:nvhpc:omp-8...
Runtime platform  arch=amd64 os=linux pid=121913 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Aneuron%3Anmodl%3Anvhpc%3Aomp-8
Created cache
section_end:1649845833:archive_cache section_start:1649845833:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=124096 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=211026 responseStatus=201 Created token=Pm8KebQh
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=124123 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=211026 responseStatus=201 Created token=Pm8KebQh
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=124152 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=211026 responseStatus=201 Created token=Pm8KebQh
section_end:1649845835:upload_artifacts_on_success section_start:1649845835:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1649845835:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1649843396:resolve_secrets Resolving secrets
section_end:1649843396:resolve_secrets section_start:1649843396:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor156994645, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 48472, build ref 9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6, job ID 211018
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
A slurm job will be created with name GL_J211018_PROD_P112_CP3_C2
Job parameters: memory=76G, cpus_per_task=8, duration=1:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 379121
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J211018_PROD_P112_CP3_C2 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --ntasks=2 --jobid=379121 --cpus-per-task=8 --mem=76G
section_end:1649843398:prepare_executor section_start:1649843398:prepare_script Preparing environment
Running on r2i2n20 via bbpv1.epfl.ch...
section_end:1649843400:prepare_script section_start:1649843400:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1649843401:get_sources section_start:1649843401:restore_cache Restoring cache
Checking cache for build:nmodl-8...
Runtime platform  arch=amd64 os=linux pid=231568 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1649843406:restore_cache section_start:1649843406:download_artifacts Downloading artifacts
Downloading artifacts for spack_setup (211016)...
Runtime platform  arch=amd64 os=linux pid=232053 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=211016 responseStatus=200 OK token=EkaigVme
section_end:1649843407:download_artifacts section_start:1649843407:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211018/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211018/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211018/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211018/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211018/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access [email protected] (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access [email protected] (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472/J211018_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = [email protected]:" > "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
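The two commands above install a per-job URL rewrite so that SSH-style bbpgitlab URLs (as used by the coreneuron recipe) are fetched over HTTPS with the job token. A minimal Python sketch of the same idea, assuming CI_JOB_TOKEN is exported as in this job and mirroring the file layout used above:

import os
from pathlib import Path

def configure_git_token_rewrite(xdg_config_home):
    # Write ${XDG_CONFIG_HOME}/git/config with an insteadOf rewrite, so that
    # git@bbpgitlab.epfl.ch:group/repo.git is cloned via HTTPS + CI_JOB_TOKEN.
    token = os.environ["CI_JOB_TOKEN"]
    cfg_dir = Path(xdg_config_home) / "git"
    cfg_dir.mkdir(parents=True, exist_ok=True)
    (cfg_dir / "config").write_text(
        '[url "https://gitlab-ci-token:%s@bbpgitlab.epfl.ch/"]\n'
        '\tinsteadOf = git@bbpgitlab.epfl.ch:\n' % token
    )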
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install nmodl%gcc ~legacy-unit
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be 7hnncpjzlyqwesmjkszimpr4c2qteocr
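For reference, the inline python above simply reads the DAG hash of the root node from the JSON emitted by spack spec --json. A stand-alone sketch of the same lookup, assuming the {"spec": {"nodes": [...]}} layout used by this Spack version:

import json
import subprocess

def concretized_hash(spec):
    # Concretize the spec and return the hash of its root node, e.g.
    # "7hnncpjzlyqwesmjkszimpr4c2qteocr" for "nmodl%gcc ~legacy-unit" here.
    out = subprocess.run(["spack", "spec", "--json", spec],
                         check=True, capture_output=True, text=True).stdout
    return json.loads(out)["spec"]["nodes"][0]["hash"]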
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
  Cache directory: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/379121/ccache
  Primary config: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/379121/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Apr 13 11:50:32 2022
  Hits: 0 / 0
    Direct: 0 / 0
    Preprocessed: 0 / 0
  Misses: 0
    Direct: 0
    Preprocessed: 0
Primary storage:
  Hits: 0 / 0
  Misses: 0
  Cache size (GB): 0.42 / 0.51 (82.69 %)
  Files: 965
$ fi
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: '7hnncpjzlyqwesmjkszimpr4c2qteocr'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- nmodl%gcc~legacy-unit
Concretized
--------------------------------
- 7hnncpj nmodl@develop%gcc@11.2.0~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
- cppb7al ^bison@3.8.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 2qmvlfy ^cmake@3.21.4%gcc@11.2.0~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
- 4bt76dp ^flex@2.6.3%gcc@11.2.0+lex~nls arch=linux-rhel7-skylake
- utrxbc3 ^ninja@1.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] onq3mhg ^py-jinja2@3.0.1%gcc@11.2.0~i18n arch=linux-rhel7-skylake
[^] 4ectow5 ^py-markupsafe@2.0.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] v4z3s5e ^py-setuptools@57.4.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 72xzp3v ^python@3.9.7%gcc@11.2.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] 22arfs4 ^py-pytest@6.2.4%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ascbeii ^py-attrs@21.2.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] kvw3vhm ^py-iniconfig@1.1.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 7iyiygo ^py-packaging@21.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] c7qvw2q ^py-pyparsing@2.4.7%gcc@11.2.0 arch=linux-rhel7-skylake
[^] mazoiox ^py-pluggy@0.13.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ciusbmc ^py-setuptools-scm@6.3.2%gcc@11.2.0+toml arch=linux-rhel7-skylake
[^] hmcew6w ^py-tomli@1.2.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] sxd7srs ^py-pip@21.1.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] y7rfzdj ^py-py@1.9.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] w4gddqx ^py-toml@0.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ci5oe5b ^py-pyyaml@5.3.1%gcc@11.2.0+libyaml arch=linux-rhel7-skylake
[^] xj2wlac ^libyaml@0.2.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] xtett6q ^py-sympy@1.8%gcc@11.2.0 arch=linux-rhel7-skylake
[^] dzb2mfs ^py-mpmath@1.1.0%gcc@11.2.0 arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> bison@3.8.2 : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-cppb7alftvhxbedsuxqv72z2thjuoizw)
==> cmake@3.21.4 : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-2qmvlfyylrv3t5ankluyr5cqey2nlfzd)
==> flex@2.6.3 : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-4bt76dpxbix6ep4qtz3mv5i2iddilv53)
==> ninja@1.10.2 : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-utrxbc3aohnru5eynalc3hyv4ca4jqte)
==> python@3.9.7 : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-72xzp3vgzh5b424qevkfruhs3ajzqbiy)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libyaml-0.2.5-xj2wla
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-mpmath-1.1.0-dzb2mf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-setuptools-57.4.0-v4z3s5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyyaml-5.3.1-ci5oe5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-sympy-1.8-xtett6
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-toml-0.10.2-w4gddq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-attrs-21.2.0-ascbei
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pip-21.1.2-sxd7sr
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyparsing-2.4.7-c7qvw2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-iniconfig-1.1.1-kvw3vh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-markupsafe-2.0.1-4ectow
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-tomli-1.2.1-hmcew6
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-packaging-21.0-7iyiyg
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-jinja2-3.0.1-onq3mh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-setuptools-scm-6.3.2-ciusbm
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pluggy-0.13.0-mazoio
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-py-1.9.0-y7rfzd
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pytest-6.2.4-22arfs
==> Installing nmodl-develop-7hnncpjzlyqwesmjkszimpr4c2qteocr
==> No binary for nmodl-develop-7hnncpjzlyqwesmjkszimpr4c2qteocr found: installing from source
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472, but it is owned by 0
==> No patches needed for nmodl
==> nmodl: Executing phase: 'cmake'
==> nmodl: Executing phase: 'build'
==> nmodl: Executing phase: 'install'
==> nmodl: Successfully installed nmodl-develop-7hnncpjzlyqwesmjkszimpr4c2qteocr
Fetch: 32.37s. Build: 2m 10.79s. Total: 2m 43.17s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/software/install_gcc-11.2.0-skylake/nmodl-develop-7hnncp
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
  Cache directory: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/379121/ccache
  Primary config: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/379121/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Apr 13 11:53:56 2022
  Hits: 133 / 135 (98.52 %)
    Direct: 31 / 135 (22.96 %)
    Preprocessed: 102 / 104 (98.08 %)
  Misses: 2
    Direct: 104
    Preprocessed: 2
  Uncacheable: 27
Primary storage:
  Hits: 164 / 270 (60.74 %)
  Misses: 106
  Cache size (GB): 0.43 / 0.51 (83.36 %)
  Files: 969
Uncacheable:
  Called for linking: 26
  No input file: 1
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading py-virtualenv/16.7.6
Autoloading py-scipy/1.7.1
Autoloading py-numpy/1.19.5
Autoloading py-mpi4py/3.1.2
Autoloading hpe-mpi/2.25.hmpt
Autoloading py-h5py/3.4.0
Autoloading hdf5/1.10.7
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
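The inline lxml one-liner above counts the <failure> and <error> elements in the JUnit report written by spack install --log-format=junit, and the job exits with that count. A standard-library sketch of the same check (lxml itself is only available here because python-dev is loaded):

import sys
import xml.etree.ElementTree as ET

def count_failures(junit_xml="install.xml"):
    # Sum of <failure> and <error> elements anywhere in the JUnit report.
    root = ET.parse(junit_xml).getroot()
    return sum(1 for _ in root.iter("failure")) + sum(1 for _ in root.iter("error"))

if __name__ == "__main__":
    sys.exit(count_failures())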
section_end:1649843638:step_script section_start:1649843638:archive_cache Saving cache for successful job
Creating cache build:nmodl-8...
Runtime platform  arch=amd64 os=linux pid=242840 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Anmodl-8
Created cache
section_end:1649843657:archive_cache section_start:1649843657:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=243038 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=211018 responseStatus=201 Created token=8Sk4MWSK
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=243078 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=211018 responseStatus=201 Created token=8Sk4MWSK
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=243120 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=211018 responseStatus=201 Created token=8Sk4MWSK
section_end:1649843659:upload_artifacts_on_success section_start:1649843659:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1649843659:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1649843322:resolve_secrets Resolving secrets
section_end:1649843322:resolve_secrets section_start:1649843322:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor045036810, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 48472, build ref 9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6, job ID 211016
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472, optional exclusive flag , optional cpus per task flag --cpus-per-task=6, optional qos flag
A slurm job will be created with name GL_J211016_PROD_P112_CP3_C2
Job parameters: memory=30750M, cpus_per_task=6, duration=1:00:00, constraint=cpu ntasks=1 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 379108
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=1 --cpus-per-task=6 --mem=30750M --job-name=GL_J211016_PROD_P112_CP3_C2 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --ntasks=1 --jobid=379108 --cpus-per-task=6 --mem=30750M
section_end:1649843323:prepare_executor section_start:1649843323:prepare_script Preparing environment
Running on r2i2n22 via bbpv1.epfl.ch...
section_end:1649843324:prepare_script section_start:1649843324:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1649843325:get_sources section_start:1649843325:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ if [[ -n "${SPACK_ENV_FILE_URL}" && "${PARSE_GITHUB_PR_DESCRIPTIONS,,}" == "true" ]]; then
$ cat > parse_description.py << END_SCRIPT # collapsed multi-line command
$ cat parse_description.py
import os
import re
import requests
pr_info = requests.get("https://api.github.com/repos/{}/pulls/{}".format(
    os.environ['CI_EXTERNAL_PULL_REQUEST_TARGET_REPOSITORY'],
    os.environ['CI_EXTERNAL_PULL_REQUEST_IID']),
    headers={'Accept': 'application/vnd.github.v3+json'})
pr_body = pr_info.json()["body"]
# match something like NEURON_BRANCH=foo/bar
pat = re.compile('^([A-Z0-9_]+)_([A-Z]+)=([A-Z0-9\-\_\/\+]+)$', re.IGNORECASE)
def parse_term(m):
    ref_type = m.group(2).lower()
    if ref_type not in {'branch', 'tag', 'ref'}: return
    print(m.group(1).upper() + '_' + ref_type.upper() + '=' + m.group(3))
if pr_body is not None:
    for pr_body_line in pr_body.splitlines():
        if not pr_body_line.startswith('CI_BRANCHES:'): continue
        for config_term in pr_body_line[12:].split(','):
            pat.sub(parse_term, config_term)
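To illustrate the convention the script consumes: the PR description may contain a single CI_BRANCHES: line of comma-separated terms such as NEURON_BRANCH=foo/bar (only *_BRANCH, *_TAG and *_REF terms are propagated). The description below is a made-up example, not taken from this pipeline:

# Hypothetical PR body; only the CI_BRANCHES: line is inspected.
example_body = (
    "Refactor the reporting tests.\n"
    "CI_BRANCHES:NEURON_BRANCH=foo/bar,SPACK_BRANCH=develop\n"
)
# Running parse_description.py against such a PR would print
#   NEURON_BRANCH=foo/bar
#   SPACK_BRANCH=develop
# which the job redirects into input_variables.env and sources.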
$ (module load unstable python-dev; python parse_description.py) > input_variables.env
Autoloading python/3.9.7
Autoloading py-virtualenv/16.7.6
Autoloading py-scipy/1.7.1
Autoloading py-numpy/1.19.5
Autoloading py-mpi4py/3.1.2
Autoloading hpe-mpi/2.25.hmpt
Autoloading py-h5py/3.4.0
Autoloading hdf5/1.10.7
$ else
$ cat input_variables.env
NEURON_BRANCH=master
NMODL_BRANCH=master
SPACK_BRANCH=develop
$ for var_to_unset in $(sed 's/^\(.*\?\)_\(BRANCH\|COMMIT\|TAG\)=.*$/\1_BRANCH\n\1_COMMIT\n\1_TAG/' input_variables.env); do # collapsed multi-line command
Unsetting NEURON_BRANCH
Unsetting NMODL_BRANCH
Unsetting SPACK_BRANCH
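The collapsed sed command above derives, for every *_BRANCH/*_COMMIT/*_TAG entry in input_variables.env, the full trio of variable names and unsets them before the file is re-sourced (hence the three "Unsetting ..." messages). A rough Python equivalent of that expansion, for illustration only:

import os
import re

def unset_ref_variables(env_file="input_variables.env"):
    names = set()
    for line in open(env_file):
        m = re.match(r"(.*?)_(BRANCH|COMMIT|TAG)=", line)
        if m:
            names.update(m.group(1) + "_" + suffix
                         for suffix in ("BRANCH", "COMMIT", "TAG"))
    for name in sorted(names):
        print("Unsetting", name)
        os.environ.pop(name, None)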
$ set -o allexport
$ . input_variables.env
$ set +o allexport
$ unset MODULEPATH
$ . /gpfs/bbp.cscs.ch/ssd/apps/bsd/${SPACK_DEPLOYMENT_SUFFIX}/config/modules.sh
$ echo "MODULEPATH=${MODULEPATH}" > spack_clone_variables.env
$ echo Preparing to clone Spack into ${PWD}
Preparing to clone Spack into /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211016
$ if [[ -z "${SPACK_BRANCH}" && ( -n "${SPACK_COMMIT}" || -n "${SPACK_TAG}" ) ]]; then
$ echo Checking out the ${SPACK_BRANCH} of Spack...
Checking out the develop of Spack...
$ module load unstable git
$ git clone -c feature.manyFiles=true --depth 1 --single-branch --branch ${SPACK_BRANCH} ${SPACK_URL} spack
Cloning into 'spack'...
Updating files: 100% (9135/9135), done.
$ export SPACK_ROOT=${PWD}/spack
$ export SPACK_USER_CACHE_PATH="${CI_BUILDS_DIR}"
$ export SPACK_SYSTEM_CONFIG_PATH="/gpfs/bbp.cscs.ch/ssd/apps/bsd/${SPACK_DEPLOYMENT_SUFFIX}/config"
$ echo "SPACK_ROOT=${SPACK_ROOT}" >> spack_clone_variables.env
$ echo "SPACK_USER_CACHE_PATH=${SPACK_USER_CACHE_PATH}" >> spack_clone_variables.env
$ echo "SPACK_SYSTEM_CONFIG_PATH=${SPACK_SYSTEM_CONFIG_PATH}" >> spack_clone_variables.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access [email protected] (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access [email protected] (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472/J211016_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = [email protected]:" > "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
$ env -0 | sed -nz '/^CUSTOM_ENV_/d;/^[^=]\+_\(BRANCH\|COMMIT\|TAG\)=.\+/p' | xargs -0t spack configure-pipeline --ignore-packages CI_BUILD CI_COMMIT CI_DEFAULT GITLAB_PIPELINES SPACK ${SPACK_SETUP_IGNORE_PACKAGE_VARIABLES} --write-commit-file=commit-mapping.env
spack configure-pipeline --ignore-packages CI_BUILD CI_COMMIT CI_DEFAULT GITLAB_PIPELINES SPACK --write-commit-file=commit-mapping.env CI_COMMIT_BRANCH=olupton/random123-as-openacc GITLAB_PIPELINES_BRANCH=main NEURON_BRANCH=master NMODL_BRANCH=master SPACK_BRANCH=develop CI_DEFAULT_BRANCH=master CORENEURON_COMMIT=9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6
==> CI_COMMIT: ignoring CI_COMMIT_BRANCH=olupton/random123-as-openacc
==> GITLAB_PIPELINES: ignoring GITLAB_PIPELINES_BRANCH=main
==> SPACK: ignoring SPACK_BRANCH=develop
==> CI_DEFAULT: ignoring CI_DEFAULT_BRANCH=master
==> neuron: resolved branch master to 991863f0c0488f17c0b489ab5bcc3e21cb3b3326
==> nmodl: resolved branch master to cde5dbf02fe93bb1b0b8cc9b5a6ebe0747ab8720
==> neuron@develop: remove branch/commit/tag
==> neuron@develop: use commit="991863f0c0488f17c0b489ab5bcc3e21cb3b3326"
==> neuron@develop: add preferred=True
==> nmodl@develop: remove branch/commit/tag
==> nmodl@develop: use commit="cde5dbf02fe93bb1b0b8cc9b5a6ebe0747ab8720"
==> nmodl@develop: add preferred=True
==> coreneuron@develop: remove branch/commit/tag
==> coreneuron@develop: use commit="9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6"
==> coreneuron@develop: add preferred=True
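The "resolved branch ... to ..." lines come from the spack configure-pipeline extension pinning moving branches to fixed commits before the package recipes are patched (see the git diff that follows). One way such a resolution can be done, as a hedged sketch rather than the extension's actual code, is with git ls-remote:

import subprocess

def resolve_branch(repo_url, branch):
    # Ask the remote which commit the branch currently points at, e.g.
    # resolve_branch("https://github.com/BlueBrain/nmodl.git", "master").
    out = subprocess.run(["git", "ls-remote", repo_url, "refs/heads/" + branch],
                         check=True, capture_output=True, text=True).stdout
    return out.split()[0]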
$ (cd "${SPACK_ROOT}" && git diff)
diff --git a/bluebrain/repo-bluebrain/packages/coreneuron/package.py b/bluebrain/repo-bluebrain/packages/coreneuron/package.py
index 88e0cc5..2d0b97d 100644
--- a/bluebrain/repo-bluebrain/packages/coreneuron/package.py
+++ b/bluebrain/repo-bluebrain/packages/coreneuron/package.py
@@ -20,7 +20,7 @@ class Coreneuron(CMakePackage):
# This simplifies testing the gitlab-pipelines repository:
git = "[email protected]:hpc/coreneuron.git"
- version('develop', branch='master')
+ version('develop', preferred=True, commit='9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6') # old: branch='master'
# 1.0.1 > 1.0.0.20210519 > 1.0 as far as Spack is concerned
version('1.0.0.20220304', commit='2d08705')
version('1.0.0.20220218', commit='102ebde')
diff --git a/bluebrain/repo-bluebrain/packages/nmodl/package.py b/bluebrain/repo-bluebrain/packages/nmodl/package.py
index 9e6c5ce..544ef43 100644
--- a/bluebrain/repo-bluebrain/packages/nmodl/package.py
+++ b/bluebrain/repo-bluebrain/packages/nmodl/package.py
@@ -14,7 +14,7 @@ class Nmodl(CMakePackage):
git = "https://github.com/BlueBrain/nmodl.git"
# 0.3.1 > 0.3.0.20220110 > 0.3.0 > 0.3b > 0.3 to Spack
- version('develop', branch='master', submodules=True)
+ version('develop', preferred=True, commit='cde5dbf02fe93bb1b0b8cc9b5a6ebe0747ab8720', submodules=True) # old: branch='master'
version('llvm', branch='llvm', submodules=True)
# For deployment; [email protected]%[email protected] doesn't build with eigen/intrinsics errors
version('0.3.0.20220110', commit='9e0a6f260ac2e6fad068a39ea3bdf7aa7a6f4ee0')
diff --git a/bluebrain/repo-patches/packages/neuron/package.py b/bluebrain/repo-patches/packages/neuron/package.py
index 3f44d8c..e5c7785 100644
--- a/bluebrain/repo-patches/packages/neuron/package.py
+++ b/bluebrain/repo-patches/packages/neuron/package.py
@@ -34,7 +34,7 @@ class Neuron(CMakePackage):
# Patch for recent CMake versions that don't identify NVHPC as PGI
patch("patch-v800-cmake-nvhpc.patch", when="@8.0.0%nvhpc^[email protected]:")
- version("develop", branch="master")
+ version('develop', preferred=True, commit='991863f0c0488f17c0b489ab5bcc3e21cb3b3326') # old: branch="master"
version("8.0.2", tag="8.0.2")
version("8.0.1", tag="8.0.1")
version("8.0.0", tag="8.0.0")
$ cat commit-mapping.env
NEURON_COMMIT=991863f0c0488f17c0b489ab5bcc3e21cb3b3326
NMODL_COMMIT=cde5dbf02fe93bb1b0b8cc9b5a6ebe0747ab8720
CORENEURON_COMMIT=9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6
$ echo "SPACK_BRANCH=${SPACK_BRANCH}" >> commit-mapping.env
$ echo "SPACK_DEPLOYMENT_SUFFIX=${SPACK_DEPLOYMENT_SUFFIX}" >> commit-mapping.env
$ cat commit-mapping.env >> spack_clone_variables.env
$ spack spec -IL ninja
Input spec
--------------------------------
- ninja
Concretized
--------------------------------
==> Bootstrapping clingo from pre-built binaries
- utrxbc3aohnru5eynalc3hyv4ca4jqte ninja@1.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
$ echo "SPACK_SETUP_COMMIT_MAPPING_URL=${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/jobs/${CI_JOB_ID}/artifacts/commit-mapping.env" >> spack_clone_variables.env
$ spack config --scope site add "config:ccache:true"
$ echo "SPACK_USE_CCACHE=true" >> spack_clone_variables.env
section_end:1649843392:step_script section_start:1649843392:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=151785 revision=58ba2b95 version=14.2.0
commit-mapping.env: found 1 matching files and directories
input_variables.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=211016 responseStatus=201 Created token=EkaigVme
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=151810 revision=58ba2b95 version=14.2.0
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=211016 responseStatus=201 Created token=EkaigVme
section_end:1649843394:upload_artifacts_on_success section_start:1649843394:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1649843394:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1649843541:resolve_secrets Resolving secrets
section_end:1649843541:resolve_secrets section_start:1649843541:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor702901559, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 48472, build ref 9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6, job ID 211034
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472, optional exclusive flag , optional cpus per task flag --cpus-per-task=1, optional qos flag
A slurm job will be created with name GL_J211034_PROD_P112_CP7_C16
Job parameters: memory=30750M, cpus_per_task=1, duration=1:00:00, constraint=cpu ntasks=8 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 379164
job state: PD
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=8 --cpus-per-task=1 --mem=30750M --job-name=GL_J211034_PROD_P112_CP7_C16 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --ntasks=8 --jobid=379164 --cpus-per-task=1 --mem=30750M
section_end:1649843544:prepare_executor section_start:1649843544:prepare_script Preparing environment
Running on r1i7n21 via bbpv1.epfl.ch...
section_end:1649843547:prepare_script section_start:1649843547:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1649843548:get_sources section_start:1649843548:download_artifacts Downloading artifacts
Downloading artifacts for build:coreneuron:mod2c:intel (211023)...
Runtime platform  arch=amd64 os=linux pid=8128 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=211023 responseStatus=200 OK token=SAThh_5K
section_end:1649843549:download_artifacts section_start:1649843549:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ cd ${SPACK_BUILD_DIR}
$ export BOOST_TEST_COLOR_OUTPUT=no
$ i_am_a_failure=0
$ spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
Site: r2i2n21
Build name: Linux-icpc
Create new tag: 20220413-0952 - Experimental
Test project /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211023/spack-build/spack-stage-coreneuron-develop-hsyhen5wbojr4dsx5rf32wndj3dpm5nb/spack-build-hsyhen5
Start 1: cmd_interface_test
Start 2: interleave_info_constructor_test
Start 3: alignment_test
Start 4: queuing_test
Start 5: lfp_test
Start 6: ring_TEST
Start 17: reporting_1
1/17 Test #1: cmd_interface_test ................. Passed 1.37 sec
2/17 Test #2: interleave_info_constructor_test ... Passed 1.53 sec
Start 7: ring_binqueue_TEST
3/17 Test #6: ring_TEST .......................... Passed 1.74 sec
Start 8: ring_multisend_TEST
4/17 Test #3: alignment_test ..................... Passed 1.83 sec
5/17 Test #17: reporting_1 ........................ Passed 1.92 sec
Start 9: ring_spike_buffer_TEST
6/17 Test #4: queuing_test ....................... Passed 2.05 sec
7/17 Test #5: lfp_test ........................... Passed 2.27 sec
Start 10: ring_permute1_TEST
8/17 Test #7: ring_binqueue_TEST ................. Passed 1.13 sec
Start 11: ring_permute2_TEST
9/17 Test #8: ring_multisend_TEST ................ Passed 1.11 sec
Start 12: ring_gap_TEST
10/17 Test #10: ring_permute1_TEST ................. Passed 1.26 sec
Start 13: ring_gap_binqueue_TEST
11/17 Test #11: ring_permute2_TEST ................. Passed 2.02 sec
Start 14: ring_gap_multisend_TEST
12/17 Test #12: ring_gap_TEST ...................... Passed 1.98 sec
Start 15: ring_gap_permute1_TEST
13/17 Test #9: ring_spike_buffer_TEST ............. Passed 3.01 sec
Start 16: ring_gap_permute2_TEST
14/17 Test #13: ring_gap_binqueue_TEST ............. Passed 1.41 sec
15/17 Test #14: ring_gap_multisend_TEST ............ Passed 1.87 sec
16/17 Test #15: ring_gap_permute1_TEST ............. Passed 4.74 sec
17/17 Test #16: ring_gap_permute2_TEST ............. Passed 4.69 sec
100% tests passed, 0 tests failed out of 17
Total Test time (real) = 9.65 sec
$ cp -r Testing/ ${CI_PROJECT_DIR}/
$ module load unstable unit-test-translator
Autoloading python/3.9.7
$ cmake2junit > ${CI_PROJECT_DIR}/ctest.xml
$ exit ${i_am_a_failure}
section_end:1649843585:step_script section_start:1649843585:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=10621 revision=58ba2b95 version=14.2.0
initial_environment.env: found 1 matching files and directories
Testing/: found 7 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=211034 responseStatus=201 Created token=w8yV_MWy
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=10653 revision=58ba2b95 version=14.2.0
ctest.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=211034 responseStatus=201 Created token=w8yV_MWy
section_end:1649843587:upload_artifacts_on_success section_start:1649843587:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1649843588:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1649843621:resolve_secrets Resolving secrets
section_end:1649843621:resolve_secrets section_start:1649843621:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor127641747, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 48472, build ref 9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6, job ID 211031
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472, optional exclusive flag , optional cpus per task flag --cpus-per-task=1, optional qos flag
A slurm job will be created with name GL_J211031_PROD_P112_CP4_C3
Job parameters: memory=30750M, cpus_per_task=1, duration=1:00:00, constraint=volta ntasks=8 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
Submitted batch job 379199
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=8 --cpus-per-task=1 --mem=30750M --job-name=GL_J211031_PROD_P112_CP4_C3 -C volta --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --ntasks=8 --jobid=379199 --cpus-per-task=1 --mem=30750M
section_end:1649843623:prepare_executor section_start:1649843623:prepare_script Preparing environment
Running on ldir01u09.bbp.epfl.ch via bbpv1.epfl.ch...
section_end:1649843630:prepare_script section_start:1649843630:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1649843631:get_sources section_start:1649843631:download_artifacts Downloading artifacts
Downloading artifacts for build:coreneuron:mod2c:nvhpc:acc:unified (211020)...
Runtime platform  arch=amd64 os=linux pid=11819 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=211020 responseStatus=200 OK token=ihHsmuMo
section_end:1649843632:download_artifacts section_start:1649843632:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ cd ${SPACK_BUILD_DIR}
$ export BOOST_TEST_COLOR_OUTPUT=no
$ i_am_a_failure=0
$ spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
Site: r2i2n20
Build name: Linux-nvc++
Create new tag: 20220413-0954 - Experimental
Test project /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211020/spack-build/spack-stage-coreneuron-develop-knqybtff3rwigwjx34v3sgxt6qlziega/spack-build-knqybtf
Start 1: cmd_interface_test
Start 2: interleave_info_constructor_test
Start 3: alignment_test
Start 4: queuing_test
Start 5: lfp_test
Start 6: ring_TEST
Start 19: reporting_1
1/19 Test #4: queuing_test ........................... Passed 1.69 sec
2/19 Test #5: lfp_test ............................... Passed 1.97 sec
Start 7: ring_binqueue_TEST
3/19 Test #1: cmd_interface_test ..................... Passed 2.43 sec
4/19 Test #2: interleave_info_constructor_test ....... Passed 70.13 sec
Start 8: ring_multisend_TEST
5/19 Test #6: ring_TEST .............................. Passed 74.66 sec
Start 9: ring_spike_buffer_TEST
6/19 Test #19: reporting_1 ............................ Passed 78.12 sec
7/19 Test #3: alignment_test ......................... Passed 79.15 sec
Start 10: ring_permute1_TEST
8/19 Test #7: ring_binqueue_TEST ..................... Passed 86.91 sec
Start 11: ring_permute2_TEST
9/19 Test #9: ring_spike_buffer_TEST ................. Passed 67.63 sec
Start 12: ring_gap_TEST
10/19 Test #10: ring_permute1_TEST ..................... Passed 63.18 sec
Start 13: ring_gap_binqueue_TEST
11/19 Test #8: ring_multisend_TEST .................... Passed 72.25 sec
Start 14: ring_gap_multisend_TEST
12/19 Test #11: ring_permute2_TEST ..................... Passed 53.91 sec
Start 15: ring_gap_permute1_TEST
13/19 Test #13: ring_gap_binqueue_TEST ................. Passed 308.14 sec
Start 16: ring_gap_permute2_TEST
14/19 Test #15: ring_gap_permute1_TEST ................. Passed 307.74 sec
Start 17: ring_permute2_cudaInterface_TEST
15/19 Test #14: ring_gap_multisend_TEST ................ Passed 309.30 sec
Start 18: ring_gap_permute2_cudaInterface_TEST
16/19 Test #12: ring_gap_TEST .......................... Passed 309.57 sec
17/19 Test #17: ring_permute2_cudaInterface_TEST ....... Passed 32.55 sec
18/19 Test #18: ring_gap_permute2_cudaInterface_TEST ... Passed 59.20 sec
19/19 Test #16: ring_gap_permute2_TEST ................. Passed 61.63 sec
100% tests passed, 0 tests failed out of 19
Total Test time (real) = 512.13 sec
$ cp -r Testing/ ${CI_PROJECT_DIR}/
$ module load unstable unit-test-translator
Autoloading python/3.9.7
$ cmake2junit > ${CI_PROJECT_DIR}/ctest.xml
$ exit ${i_am_a_failure}
section_end:1649844171:step_script section_start:1649844171:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=31709 revision=58ba2b95 version=14.2.0
initial_environment.env: found 1 matching files and directories
Testing/: found 7 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=211031 responseStatus=201 Created token=qHjYRuuq
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=31773 revision=58ba2b95 version=14.2.0
ctest.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=211031 responseStatus=201 Created token=qHjYRuuq
section_end:1649844172:upload_artifacts_on_success section_start:1649844172:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1649844173:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1649843622:resolve_secrets Resolving secrets
section_end:1649843622:resolve_secrets section_start:1649843622:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor429934455, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 48472, build ref 9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6, job ID 211030
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472, optional exclusive flag , optional cpus per task flag --cpus-per-task=1, optional qos flag
A slurm job will be created with name GL_J211030_PROD_P112_CP7_C11
Job parameters: memory=30750M, cpus_per_task=1, duration=1:00:00, constraint=volta ntasks=8 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
Submitted batch job 379201
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=8 --cpus-per-task=1 --mem=30750M --job-name=GL_J211030_PROD_P112_CP7_C11 -C volta --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --ntasks=8 --jobid=379201 --cpus-per-task=1 --mem=30750M
section_end:1649843625:prepare_executor section_start:1649843625:prepare_script Preparing environment
Running on ldir01u09.bbp.epfl.ch via bbpv1.epfl.ch...
section_end:1649843633:prepare_script section_start:1649843633:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1649843634:get_sources section_start:1649843634:download_artifacts Downloading artifacts
Downloading artifacts for build:coreneuron:mod2c:nvhpc:acc (211019)...
Runtime platform  arch=amd64 os=linux pid=12418 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=211019 responseStatus=200 OK token=Qw7KHaR5
section_end:1649843635:download_artifacts section_start:1649843635:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ cd ${SPACK_BUILD_DIR}
$ export BOOST_TEST_COLOR_OUTPUT=no
$ i_am_a_failure=0
$ spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
Site: r2i2n20
Build name: Linux-nvc++
Create new tag: 20220413-0954 - Experimental
Test project /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211019/spack-build/spack-stage-coreneuron-develop-myaj5gmcll4sywf66372ze5qfubwzr3s/spack-build-myaj5gm
Start 1: cmd_interface_test
Start 2: interleave_info_constructor_test
Start 3: alignment_test
Start 4: queuing_test
Start 5: lfp_test
Start 6: ring_TEST
Start 19: reporting_1
1/19 Test #2: interleave_info_constructor_test ....... Passed 1.33 sec
2/19 Test #1: cmd_interface_test ..................... Passed 2.02 sec
Start 7: ring_binqueue_TEST
3/19 Test #3: alignment_test ......................... Passed 2.14 sec
4/19 Test #4: queuing_test ........................... Passed 2.31 sec
Start 8: ring_multisend_TEST
5/19 Test #5: lfp_test ............................... Passed 3.73 sec
6/19 Test #19: reporting_1 ............................ Passed 87.18 sec
Start 9: ring_spike_buffer_TEST
7/19 Test #7: ring_binqueue_TEST ..................... Passed 85.98 sec
Start 10: ring_permute1_TEST
8/19 Test #6: ring_TEST .............................. Passed 88.04 sec
Start 11: ring_permute2_TEST
9/19 Test #8: ring_multisend_TEST .................... Passed 85.77 sec
Start 12: ring_gap_TEST
10/19 Test #9: ring_spike_buffer_TEST ................. Passed 58.34 sec
Start 13: ring_gap_binqueue_TEST
11/19 Test #11: ring_permute2_TEST ..................... Passed 57.55 sec
Start 14: ring_gap_multisend_TEST
12/19 Test #12: ring_gap_TEST .......................... Passed 73.86 sec
Start 15: ring_gap_permute1_TEST
13/19 Test #10: ring_permute1_TEST ..................... Passed 87.11 sec
Start 16: ring_gap_permute2_TEST
14/19 Test #15: ring_gap_permute1_TEST ................. Passed 33.62 sec
Start 17: ring_permute2_cudaInterface_TEST
15/19 Test #14: ring_gap_multisend_TEST ................ Passed 57.76 sec
Start 18: ring_gap_permute2_cudaInterface_TEST
16/19 Test #13: ring_gap_binqueue_TEST ................. Passed 57.90 sec
17/19 Test #16: ring_gap_permute2_TEST ................. Passed 42.96 sec
18/19 Test #17: ring_permute2_cudaInterface_TEST ....... Passed 50.13 sec
19/19 Test #18: ring_gap_permute2_cudaInterface_TEST ... Passed 55.27 sec
100% tests passed, 0 tests failed out of 19
Total Test time (real) = 258.78 sec
$ cp -r Testing/ ${CI_PROJECT_DIR}/
$ module load unstable unit-test-translator
Autoloading python/3.9.7
$ cmake2junit > ${CI_PROJECT_DIR}/ctest.xml
$ exit ${i_am_a_failure}
section_end:1649843919:step_script section_start:1649843919:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=22463 revision=58ba2b95 version=14.2.0
initial_environment.env: found 1 matching files and directories
Testing/: found 7 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=211030 responseStatus=201 Created token=jNC4Lzir
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=22495 revision=58ba2b95 version=14.2.0
ctest.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=211030 responseStatus=201 Created token=jNC4Lzir
section_end:1649843920:upload_artifacts_on_success section_start:1649843920:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1649843921:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1649843982:resolve_secrets Resolving secrets
section_end:1649843982:resolve_secrets section_start:1649843982:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor647393288, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 48472, build ref 9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6, job ID 211035
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472, optional exclusive flag , optional cpus per task flag --cpus-per-task=1, optional qos flag
A slurm job will be created with name GL_J211035_PROD_P112_CP7_C9
Job parameters: memory=30750M, cpus_per_task=1, duration=1:00:00, constraint=cpu ntasks=8 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 379275
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=8 --cpus-per-task=1 --mem=30750M --job-name=GL_J211035_PROD_P112_CP7_C9 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --ntasks=8 --jobid=379275 --cpus-per-task=1 --mem=30750M
section_end:1649843984:prepare_executor section_start:1649843984:prepare_script Preparing environment
Running on r1i4n5 via bbpv1.epfl.ch...
section_end:1649843986:prepare_script section_start:1649843986:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1649843987:get_sources section_start:1649843987:download_artifacts Downloading artifacts
Downloading artifacts for build:coreneuron:nmodl:intel (211024)...
Runtime platform  arch=amd64 os=linux pid=231021 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=211024 responseStatus=200 OK token=ym6v9xs-
section_end:1649843988:download_artifacts section_start:1649843988:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ cd ${SPACK_BUILD_DIR}
$ export BOOST_TEST_COLOR_OUTPUT=no
$ i_am_a_failure=0
$ spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
Site: r5i0n34
Build name: Linux-icpc
Create new tag: 20220413-1000 - Experimental
Test project /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211024/spack-build/spack-stage-coreneuron-develop-x2m7t35nswt5wf62dii5jjfu45snneyv/spack-build-x2m7t35
Start 1: cmd_interface_test
Start 2: interleave_info_constructor_test
Start 3: alignment_test
Start 4: queuing_test
Start 5: lfp_test
Start 6: ring_TEST
Start 17: reporting_1
1/17 Test #3: alignment_test ..................... Passed 2.26 sec
2/17 Test #1: cmd_interface_test ................. Passed 2.50 sec
Start 7: ring_binqueue_TEST
3/17 Test #2: interleave_info_constructor_test ... Passed 2.67 sec
4/17 Test #4: queuing_test ....................... Passed 2.69 sec
Start 8: ring_multisend_TEST
5/17 Test #5: lfp_test ........................... Passed 2.80 sec
6/17 Test #17: reporting_1 ........................ Passed 2.90 sec
Start 9: ring_spike_buffer_TEST
7/17 Test #6: ring_TEST .......................... Passed 2.93 sec
Start 10: ring_permute1_TEST
8/17 Test #7: ring_binqueue_TEST ................. Passed 1.21 sec
Start 11: ring_permute2_TEST
9/17 Test #8: ring_multisend_TEST ................ Passed 1.89 sec
Start 12: ring_gap_TEST
10/17 Test #9: ring_spike_buffer_TEST ............. Passed 2.70 sec
Start 13: ring_gap_binqueue_TEST
11/17 Test #10: ring_permute1_TEST ................. Passed 2.70 sec
Start 14: ring_gap_multisend_TEST
12/17 Test #11: ring_permute2_TEST ................. Passed 2.66 sec
Start 15: ring_gap_permute1_TEST
13/17 Test #12: ring_gap_TEST ...................... Passed 1.84 sec
Start 16: ring_gap_permute2_TEST
14/17 Test #13: ring_gap_binqueue_TEST ............. Passed 1.43 sec
15/17 Test #14: ring_gap_multisend_TEST ............ Passed 1.90 sec
16/17 Test #15: ring_gap_permute1_TEST ............. Passed 1.96 sec
17/17 Test #16: ring_gap_permute2_TEST ............. Passed 1.95 sec
100% tests passed, 0 tests failed out of 17
Total Test time (real) = 8.40 sec
$ cp -r Testing/ ${CI_PROJECT_DIR}/
$ module load unstable unit-test-translator
Autoloading python/3.9.7
$ cmake2junit > ${CI_PROJECT_DIR}/ctest.xml
$ exit ${i_am_a_failure}
section_end:1649844025:step_script section_start:1649844025:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=232650 revision=58ba2b95 version=14.2.0
initial_environment.env: found 1 matching files and directories
Testing/: found 7 matching files and directories 
Uploading artifacts as "archive" to coordinator... ok id=211035 responseStatus=201 Created token=-VUszuXZ
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=232682 revision=58ba2b95 version=14.2.0
ctest.xml: found 1 matching files and directories 
Uploading artifacts as "junit" to coordinator... ok id=211035 responseStatus=201 Created token=-VUszuXZ
section_end:1649844027:upload_artifacts_on_success section_start:1649844027:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1649844028:cleanup_file_variables Job succeeded
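Every job in this log acquires its allocation the same way: the BB5 custom executor submits a placeholder batch job that only runs "sleep infinity", then attaches the real job steps to that allocation with srun using the returned job id. A minimal sketch of the pattern, mirroring the sbatch/srun lines echoed above; the --parsable flag, the BUILD_DIR variable and the job_steps.sh script are illustrative stand-ins, not the driver's actual code:

  # Sketch only: reserve the allocation, then run the generated step script inside it.
  jobid=$(sbatch --parsable -p prod -A proj9998 --ntasks=8 --cpus-per-task=1 \
          --mem=30750M -C cpu --no-requeue --time=1:00:00 \
          --wrap="sleep infinity")
  srun --mpi=none --jobid="${jobid}" --ntasks=8 --cpus-per-task=1 --mem=30750M \
       --chdir="${BUILD_DIR}" bash job_steps.sh   # BUILD_DIR, job_steps.sh: hypothetical

Keeping the allocation separate from the step script appears to let the runner reuse one slurm job for every stage of the GitLab job (prepare, get_sources, step_script, artifact upload) and release it once at the end.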
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1649844033:resolve_secrets Resolving secrets
section_end:1649844033:resolve_secrets section_start:1649844033:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor820945637, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 48472, build ref 9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6, job ID 211033
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472, optional exclusive flag , optional cpus per task flag --cpus-per-task=1, optional qos flag
A slurm job will be created with name GL_J211033_PROD_P112_CP7_C7
Job parameters: memory=30750M, cpus_per_task=1, duration=1:00:00, constraint=volta ntasks=8 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
Submitted batch job 379283
job state: PD
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=8 --cpus-per-task=1 --mem=30750M --job-name=GL_J211033_PROD_P112_CP7_C7 -C volta --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --ntasks=8 --jobid=379283 --cpus-per-task=1 --mem=30750M
section_end:1649844036:prepare_executor section_start:1649844036:prepare_script Preparing environment
Running on ldir01u09.bbp.epfl.ch via bbpv1.epfl.ch...
section_end:1649844045:prepare_script section_start:1649844045:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1649844047:get_sources section_start:1649844047:download_artifacts Downloading artifacts
Downloading artifacts for build:coreneuron:nmodl:nvhpc:acc (211022)...
Runtime platform  arch=amd64 os=linux pid=28013 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=211022 responseStatus=200 OK token=T5xFNgyz
section_end:1649844048:download_artifacts section_start:1649844048:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ cd ${SPACK_BUILD_DIR}
$ export BOOST_TEST_COLOR_OUTPUT=no
$ i_am_a_failure=0
$ spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
Site: r1i4n0
Build name: Linux-nvc++
Create new tag: 20220413-1001 - Experimental
Test project /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211022/spack-build/spack-stage-coreneuron-develop-7eavmcfdqg53mknhnfbcd3xnco2pxk7o/spack-build-7eavmcf
Start 1: cmd_interface_test
Start 2: interleave_info_constructor_test
Start 3: alignment_test
Start 4: queuing_test
Start 5: lfp_test
Start 6: ring_TEST
Start 19: reporting_1
1/19 Test #6: ring_TEST .............................. Passed 10.78 sec
Start 7: ring_binqueue_TEST
2/19 Test #4: queuing_test ........................... Passed 10.85 sec
3/19 Test #1: cmd_interface_test ..................... Passed 11.36 sec
Start 8: ring_multisend_TEST
4/19 Test #3: alignment_test ......................... Passed 11.53 sec
5/19 Test #19: reporting_1 ............................ Passed 11.61 sec
Start 9: ring_spike_buffer_TEST
6/19 Test #2: interleave_info_constructor_test ....... Passed 11.90 sec
7/19 Test #5: lfp_test ............................... Passed 12.28 sec
Start 10: ring_permute1_TEST
8/19 Test #8: ring_multisend_TEST .................... Passed 11.62 sec
Start 11: ring_permute2_TEST
9/19 Test #9: ring_spike_buffer_TEST ................. Passed 28.59 sec
Start 12: ring_gap_TEST
10/19 Test #10: ring_permute1_TEST ..................... Passed 28.71 sec
Start 13: ring_gap_binqueue_TEST
11/19 Test #7: ring_binqueue_TEST ..................... Passed 30.42 sec
Start 14: ring_gap_multisend_TEST
12/19 Test #11: ring_permute2_TEST ..................... Passed 23.95 sec
Start 15: ring_gap_permute1_TEST
13/19 Test #13: ring_gap_binqueue_TEST ................. Passed 50.47 sec
Start 16: ring_gap_permute2_TEST
14/19 Test #12: ring_gap_TEST .......................... Passed 54.65 sec
Start 17: ring_permute2_cudaInterface_TEST
15/19 Test #15: ring_gap_permute1_TEST ................. Passed 56.55 sec
Start 18: ring_gap_permute2_cudaInterface_TEST
16/19 Test #14: ring_gap_multisend_TEST ................ Passed 62.32 sec
17/19 Test #16: ring_gap_permute2_TEST ................. Passed 28.07 sec
18/19 Test #17: ring_permute2_cudaInterface_TEST ....... Passed 24.92 sec
19/19 Test #18: ring_gap_permute2_cudaInterface_TEST ... Passed 18.92 sec
100% tests passed, 0 tests failed out of 19
Total Test time (real) = 122.42 sec
$ cp -r Testing/ ${CI_PROJECT_DIR}/
$ module load unstable unit-test-translator
Autoloading python/3.9.7
$ cmake2junit > ${CI_PROJECT_DIR}/ctest.xml
$ exit ${i_am_a_failure}
section_end:1649844200:step_script section_start:1649844200:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=32432 revision=58ba2b95 version=14.2.0
initial_environment.env: found 1 matching files and directories
Testing/: found 7 matching files and directories 
Uploading artifacts as "archive" to coordinator... ok id=211033 responseStatus=201 Created token=wxprY66y
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=32462 revision=58ba2b95 version=14.2.0
ctest.xml: found 1 matching files and directories 
Uploading artifacts as "junit" to coordinator... ok id=211033 responseStatus=201 Created token=wxprY66y
section_end:1649844201:upload_artifacts_on_success section_start:1649844201:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1649844202:cleanup_file_variables Job succeeded
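The test jobs all run the same step script, shown line by line above: the ctest invocation is wrapped in spack build-env so the tests run with the environment the package was built with, and its exit status is stashed in i_am_a_failure so the Testing/ directory and the JUnit report are still produced and uploaded before the job is marked failed. A condensed sketch of that step, using the CI variables already visible in the log (the status variable name here is illustrative):

  export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}   # one concurrent test per slurm task
  cd "${SPACK_BUILD_DIR}"
  status=0
  spack build-env "${SPACK_FULL_SPEC}" -- ctest --output-on-failure -T Test || status=1
  cp -r Testing/ "${CI_PROJECT_DIR}/"                   # raw CTest output, uploaded as the archive artifact
  module load unstable unit-test-translator             # supplies cmake2junit
  cmake2junit > "${CI_PROJECT_DIR}/ctest.xml"           # uploaded as the junit artifact
  exit ${status}                                        # only now propagate the ctest result

Deferring the exit status until after the copy and conversion is what guarantees the junit artifact exists even when ctest fails.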
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1649844034:resolve_secrets Resolving secrets
section_end:1649844034:resolve_secrets section_start:1649844034:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor487356905, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 48472, build ref 9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6, job ID 211032
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472, optional exclusive flag , optional cpus per task flag --cpus-per-task=1, optional qos flag
A slurm job will be created with name GL_J211032_PROD_P112_CP9_C10
Job parameters: memory=30750M, cpus_per_task=1, duration=1:00:00, constraint=volta ntasks=8 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
Submitted batch job 379285
job state: PD
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=8 --cpus-per-task=1 --mem=30750M --job-name=GL_J211032_PROD_P112_CP9_C10 -C volta --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --ntasks=8 --jobid=379285 --cpus-per-task=1 --mem=30750M
section_end:1649844037:prepare_executor section_start:1649844037:prepare_script Preparing environment
Running on ldir01u09.bbp.epfl.ch via bbpv1.epfl.ch...
section_end:1649844045:prepare_script section_start:1649844045:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1649844047:get_sources section_start:1649844047:download_artifacts Downloading artifacts
Downloading artifacts for build:coreneuron:nmodl:nvhpc:omp (211021)...
Runtime platform  arch=amd64 os=linux pid=27942 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=211021 responseStatus=200 OK token=Koj5cr-4
section_end:1649844048:download_artifacts section_start:1649844048:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ cd ${SPACK_BUILD_DIR}
$ export BOOST_TEST_COLOR_OUTPUT=no
$ i_am_a_failure=0
$ spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
Site: r1i4n0
Build name: Linux-nvc++
Create new tag: 20220413-1001 - Experimental
Test project /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211021/spack-build/spack-stage-coreneuron-develop-dqfui2d7qu3mc6w6xhrktts7kg2pjeio/spack-build-dqfui2d
Start 1: cmd_interface_test
Start 2: interleave_info_constructor_test
Start 3: alignment_test
Start 4: queuing_test
Start 5: lfp_test
Start 6: ring_TEST
Start 19: reporting_1
1/19 Test #1: cmd_interface_test ..................... Passed 11.03 sec
2/19 Test #3: alignment_test ......................... Passed 11.04 sec
Start 7: ring_binqueue_TEST
3/19 Test #4: queuing_test ........................... Passed 11.48 sec
4/19 Test #2: interleave_info_constructor_test ....... Passed 11.80 sec
Start 8: ring_multisend_TEST
5/19 Test #5: lfp_test ............................... Passed 12.26 sec
6/19 Test #6: ring_TEST .............................. Passed 13.11 sec
Start 9: ring_spike_buffer_TEST
7/19 Test #19: reporting_1 ............................ Passed 13.41 sec
Start 10: ring_permute1_TEST
8/19 Test #9: ring_spike_buffer_TEST ................. Passed 41.12 sec
Start 11: ring_permute2_TEST
9/19 Test #8: ring_multisend_TEST .................... Passed 42.49 sec
Start 12: ring_gap_TEST
10/19 Test #10: ring_permute1_TEST ..................... Passed 40.92 sec
Start 13: ring_gap_binqueue_TEST
11/19 Test #7: ring_binqueue_TEST ..................... Passed 43.30 sec
Start 14: ring_gap_multisend_TEST
12/19 Test #11: ring_permute2_TEST ..................... Passed 42.17 sec
Start 15: ring_gap_permute1_TEST
13/19 Test #14: ring_gap_multisend_TEST ................ Passed 151.75 sec
Start 16: ring_gap_permute2_TEST
14/19 Test #13: ring_gap_binqueue_TEST ................. Passed 161.85 sec
Start 17: ring_permute2_cudaInterface_TEST
15/19 Test #12: ring_gap_TEST .......................... Passed 162.25 sec
Start 18: ring_gap_permute2_cudaInterface_TEST
16/19 Test #15: ring_gap_permute1_TEST ................. Passed 123.52 sec
17/19 Test #16: ring_gap_permute2_TEST ................. Passed 14.88 sec
18/19 Test #17: ring_permute2_cudaInterface_TEST ....... Passed 8.42 sec
19/19 Test #18: ring_gap_permute2_cudaInterface_TEST ... Passed 10.86 sec
100% tests passed, 0 tests failed out of 19
Total Test time (real) = 227.44 sec
$ cp -r Testing/ ${CI_PROJECT_DIR}/
$ module load unstable unit-test-translator
Autoloading python/3.9.7
$ cmake2junit > ${CI_PROJECT_DIR}/ctest.xml
$ exit ${i_am_a_failure}
section_end:1649844305:step_script section_start:1649844305:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=33364 revision=58ba2b95 version=14.2.0
initial_environment.env: found 1 matching files and directories
Testing/: found 7 matching files and directories 
Uploading artifacts as "archive" to coordinator... ok id=211032 responseStatus=201 Created token=suhroZAj
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=33395 revision=58ba2b95 version=14.2.0
ctest.xml: found 1 matching files and directories 
Uploading artifacts as "junit" to coordinator... ok id=211032 responseStatus=201 Created token=suhroZAj
section_end:1649844306:upload_artifacts_on_success section_start:1649844306:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1649844306:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1649844161:resolve_secrets Resolving secrets
section_end:1649844161:resolve_secrets section_start:1649844161:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor323865865, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 48472, build ref 9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6, job ID 211039
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472, optional exclusive flag , optional cpus per task flag --cpus-per-task=1, optional qos flag
A slurm job will be created with name GL_J211039_PROD_P112_CP5_C11
Job parameters: memory=76G, cpus_per_task=1, duration=1:00:00, constraint=cpu ntasks=16 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 379294
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=16 --cpus-per-task=1 --mem=76G --job-name=GL_J211039_PROD_P112_CP5_C11 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --ntasks=16 --jobid=379294 --cpus-per-task=1 --mem=76G
section_end:1649844163:prepare_executor section_start:1649844163:prepare_script Preparing environment
Running on r1i7n21 via bbpv1.epfl.ch...
section_end:1649844166:prepare_script section_start:1649844166:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1649844168:get_sources section_start:1649844168:download_artifacts Downloading artifacts
Downloading artifacts for build:neuron:mod2c:intel (211028)...
Runtime platform  arch=amd64 os=linux pid=32894 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=211028 responseStatus=200 OK token=Y6xPs83u
section_end:1649844169:download_artifacts section_start:1649844169:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ cd ${SPACK_BUILD_DIR}
$ export BOOST_TEST_COLOR_OUTPUT=no
$ i_am_a_failure=0
$ spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
Site: r1i7n21
Build name: Linux-icpc
Create new tag: 20220413-1003 - Experimental
Test project /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211028/spack-build/spack-stage-neuron-develop-47kwkdqbfahzhd6mgo3d7lvidjenlcwo/spack-build-47kwkdq
Start 48: external_ringtest::coreneuron_cpu_mpi_offline::preparation
Start 55: testcorenrn_bbcore::coreneuron_cpu_offline::preparation
Start 61: testcorenrn_conc::coreneuron_cpu_offline::preparation
Start 67: testcorenrn_deriv::coreneuron_cpu_offline::preparation
Start 73: testcorenrn_gf::coreneuron_cpu_offline::preparation
Start 79: testcorenrn_kin::coreneuron_cpu_offline::preparation
Start 84: testcorenrn_patstim::coreneuron_cpu_offline::preparation
Start 90: testcorenrn_vecplay::coreneuron_cpu_offline::preparation
Start 96: testcorenrn_vecevent::coreneuron_cpu_offline::preparation
1/120 Test #55: testcorenrn_bbcore::coreneuron_cpu_offline::preparation ............ Passed 4.12 sec
Start 43: external_ringtest::neuron
2/120 Test #79: testcorenrn_kin::coreneuron_cpu_offline::preparation ............... Passed 4.26 sec
Start 51: testcorenrn_bbcore::neuron
3/120 Test #61: testcorenrn_conc::coreneuron_cpu_offline::preparation .............. Passed 6.43 sec
Start 52: testcorenrn_bbcore::coreneuron_cpu_online
4/120 Test #67: testcorenrn_deriv::coreneuron_cpu_offline::preparation ............. Passed 9.14 sec
Start 53: testcorenrn_bbcore::coreneuron_cpu_online_psolve_alternate
5/120 Test #90: testcorenrn_vecplay::coreneuron_cpu_offline::preparation ........... Passed 11.78 sec
Start 102: testcorenrn_watch::coreneuron_cpu_offline::preparation
6/120 Test #73: testcorenrn_gf::coreneuron_cpu_offline::preparation ................ Passed 14.94 sec
Start 44: external_ringtest::neuron_mpi
7/120 Test #84: testcorenrn_patstim::coreneuron_cpu_offline::preparation ........... Passed 15.54 sec
Start 45: external_ringtest::coreneuron_cpu_mpi_offline_saverestore
8/120 Test #51: testcorenrn_bbcore::neuron ......................................... Passed 11.21 sec
Start 54: testcorenrn_bbcore::coreneuron_cpu_offline
9/120 Test #96: testcorenrn_vecevent::coreneuron_cpu_offline::preparation .......... Passed 16.00 sec
Start 117: olfactory-bulb-3d::neuron::preparation
10/120 Test #117: olfactory-bulb-3d::neuron::preparation ............................. Passed 1.07 sec
Start 119: olfactory-bulb-3d::coreneuron_cpu_online::preparation
11/120 Test #119: olfactory-bulb-3d::coreneuron_cpu_online::preparation .............. Passed 0.11 sec
Start 40: reduced_dentate::neuron
12/120 Test #43: external_ringtest::neuron .......................................... Passed 13.05 sec
Start 57: testcorenrn_conc::neuron
13/120 Test #53: testcorenrn_bbcore::coreneuron_cpu_online_psolve_alternate ......... Passed 8.92 sec
Start 58: testcorenrn_conc::coreneuron_cpu_online
14/120 Test #52: testcorenrn_bbcore::coreneuron_cpu_online .......................... Passed 13.34 sec
Start 59: testcorenrn_conc::coreneuron_cpu_online_psolve_alternate
15/120 Test #54: testcorenrn_bbcore::coreneuron_cpu_offline ......................... Passed 4.35 sec
Start 60: testcorenrn_conc::coreneuron_cpu_offline
16/120 Test #48: external_ringtest::coreneuron_cpu_mpi_offline::preparation ......... Passed 20.35 sec
Start 46: external_ringtest::coreneuron_cpu_mpi
17/120 Test #102: testcorenrn_watch::coreneuron_cpu_offline::preparation ............. Passed 8.69 sec
Start 47: external_ringtest::coreneuron_cpu_mpi_offline
18/120 Test #57: testcorenrn_conc::neuron ........................................... Passed 5.06 sec
Start 63: testcorenrn_deriv::neuron
19/120 Test #60: testcorenrn_conc::coreneuron_cpu_offline ........................... Passed 5.51 sec
Start 64: testcorenrn_deriv::coreneuron_cpu_online
20/120 Test #47: external_ringtest::coreneuron_cpu_mpi_offline ...................... Passed 9.24 sec
Start 65: testcorenrn_deriv::coreneuron_cpu_online_psolve_alternate
Start 66: testcorenrn_deriv::coreneuron_cpu_offline
21/120 Test #44: external_ringtest::neuron_mpi ...................................... Passed 15.92 sec
Start 69: testcorenrn_gf::neuron
22/120 Test #58: testcorenrn_conc::coreneuron_cpu_online ............................ Passed 15.10 sec
Start 75: testcorenrn_kin::neuron
23/120 Test #66: testcorenrn_deriv::coreneuron_cpu_offline .......................... Passed 5.31 sec
Start 76: testcorenrn_kin::coreneuron_cpu_online
24/120 Test #63: testcorenrn_deriv::neuron .......................................... Passed 13.88 sec
Start 77: testcorenrn_kin::coreneuron_cpu_online_psolve_alternate
25/120 Test #59: testcorenrn_conc::coreneuron_cpu_online_psolve_alternate ........... Passed 18.79 sec
Start 78: testcorenrn_kin::coreneuron_cpu_offline
26/120 Test #40: reduced_dentate::neuron ............................................ Passed 22.30 sec
Start 41: reduced_dentate::coreneuron_cpu
27/120 Test #69: testcorenrn_gf::neuron ............................................. Passed 9.33 sec
Start 70: testcorenrn_gf::coreneuron_cpu_online
28/120 Test #78: testcorenrn_kin::coreneuron_cpu_offline ............................ Passed 3.87 sec
Start 108: channel_benchmark_hippo::neuron
29/120 Test #64: testcorenrn_deriv::coreneuron_cpu_online ........................... Passed 17.06 sec
Start 109: channel_benchmark_hippo::coreneuron_cpu_online
30/120 Test #75: testcorenrn_kin::neuron ............................................ Passed 9.39 sec
Start 110: channel_benchmark_hippo::coreneuron_cpu_filemode
31/120 Test #46: external_ringtest::coreneuron_cpu_mpi .............................. Passed 22.50 sec
Start 71: testcorenrn_gf::coreneuron_cpu_online_psolve_alternate
32/120 Test #65: testcorenrn_deriv::coreneuron_cpu_online_psolve_alternate .......... Passed 13.38 sec
Start 112: channel_benchmark_sscx::neuron
33/120 Test #76: testcorenrn_kin::coreneuron_cpu_online ............................. Passed 7.96 sec
Start 113: channel_benchmark_sscx::coreneuron_cpu_online
34/120 Test #77: testcorenrn_kin::coreneuron_cpu_online_psolve_alternate ............ Passed 7.41 sec
Start 114: channel_benchmark_sscx::coreneuron_cpu_filemode
35/120 Test #70: testcorenrn_gf::coreneuron_cpu_online .............................. Passed 17.01 sec
Start 72: testcorenrn_gf::coreneuron_cpu_offline
36/120 Test #45: external_ringtest::coreneuron_cpu_mpi_offline_saverestore .......... Passed 46.66 sec
Start 81: testcorenrn_patstim::neuron
37/120 Test #72: testcorenrn_gf::coreneuron_cpu_offline ............................. Passed 8.87 sec
Start 82: testcorenrn_patstim::coreneuron_cpu_offline_saverestore
38/120 Test #81: testcorenrn_patstim::neuron ........................................ Passed 15.24 sec
Start 83: testcorenrn_patstim::coreneuron_cpu_offline
39/120 Test #71: testcorenrn_gf::coreneuron_cpu_online_psolve_alternate ............. Passed 41.49 sec
Start 86: testcorenrn_vecplay::neuron
40/120 Test #83: testcorenrn_patstim::coreneuron_cpu_offline ........................ Passed 7.70 sec
Start 87: testcorenrn_vecplay::coreneuron_cpu_online
41/120 Test #82: testcorenrn_patstim::coreneuron_cpu_offline_saverestore ............ Passed 31.05 sec
Start 88: testcorenrn_vecplay::coreneuron_cpu_online_psolve_alternate
42/120 Test #86: testcorenrn_vecplay::neuron ........................................ Passed 18.84 sec
Start 89: testcorenrn_vecplay::coreneuron_cpu_offline
43/120 Test #89: testcorenrn_vecplay::coreneuron_cpu_offline ........................ Passed 6.54 sec
Start 98: testcorenrn_watch::neuron
44/120 Test #87: testcorenrn_vecplay::coreneuron_cpu_online ......................... Passed 32.55 sec
Start 99: testcorenrn_watch::coreneuron_cpu_online
45/120 Test #98: testcorenrn_watch::neuron .......................................... Passed 17.17 sec
Start 100: testcorenrn_watch::coreneuron_cpu_online_psolve_alternate
46/120 Test #41: reduced_dentate::coreneuron_cpu .................................... Passed 92.84 sec
Start 92: testcorenrn_vecevent::neuron
47/120 Test #88: testcorenrn_vecplay::coreneuron_cpu_online_psolve_alternate ........ Passed 38.03 sec
Start 101: testcorenrn_watch::coreneuron_cpu_offline
48/120 Test #101: testcorenrn_watch::coreneuron_cpu_offline .......................... Passed 10.48 sec
Start 104: testcorenrn_netstimdirect::direct_netstimdirect
49/120 Test #92: testcorenrn_vecevent::neuron ....................................... Passed 23.01 sec
Start 93: testcorenrn_vecevent::coreneuron_cpu_online
50/120 Test #99: testcorenrn_watch::coreneuron_cpu_online ........................... Passed 47.80 sec
Start 105: testcorenrn_netstimdirect::direct_netstimdirect_psolve_alternate
51/120 Test #100: testcorenrn_watch::coreneuron_cpu_online_psolve_alternate .......... Passed 47.16 sec
Start 1: testneuron
Start 2: ringtest
52/120 Test #1: testneuron ......................................................... Passed 0.94 sec
Start 3: connect_dend
53/120 Test #2: ringtest ........................................................... Passed 3.37 sec
Start 4: mpi_init::nrniv_mpiopt
54/120 Test #3: connect_dend ....................................................... Passed 5.38 sec
Start 5: mpi_init::nrniv_nrnmpi_init
55/120 Test #5: mpi_init::nrniv_nrnmpi_init ........................................ Passed 2.37 sec
Start 6: mpi_init::python_nrnmpi_init
56/120 Test #4: mpi_init::nrniv_mpiopt ............................................. Passed 6.73 sec
Start 7: mpi_init::python_mpienv
57/120 Test #104: testcorenrn_netstimdirect::direct_netstimdirect .................... Passed 38.96 sec
Start 8: mpi_init::nrniv_mpiexec_mpiopt
Start 9: mpi_init::nrniv_mpiexec_nrnmpi_init
58/120 Test #9: mpi_init::nrniv_mpiexec_nrnmpi_init ................................ Passed 7.64 sec
Start 10: mpi_init::python_mpiexec_nrnmpi_init
59/120 Test #6: mpi_init::python_nrnmpi_init ....................................... Passed 9.77 sec
Start 11: mpi_init::python_mpiexec_mpienv
60/120 Test #8: mpi_init::nrniv_mpiexec_mpiopt ..................................... Passed 10.24 sec
Start 12: pynrn::basic_tests
61/120 Test #109: channel_benchmark_hippo::coreneuron_cpu_online ..................... Passed 154.63 sec
Start 13: parallel_tests
62/120 Test #93: testcorenrn_vecevent::coreneuron_cpu_online ........................ Passed 46.66 sec
Start 94: testcorenrn_vecevent::coreneuron_cpu_online_psolve_alternate
63/120 Test #7: mpi_init::python_mpienv ............................................ Passed 23.04 sec
Start 14: parallel_partrans
64/120 Test #114: channel_benchmark_sscx::coreneuron_cpu_filemode .................... Passed 164.08 sec
Start 15: parallel_netpar
65/120 Test #113: channel_benchmark_sscx::coreneuron_cpu_online ...................... Passed 164.58 sec
Start 16: parallel_bas
66/120 Test #112: channel_benchmark_sscx::neuron ..................................... Passed 164.96 sec
Start 17: coreneuron_modtests::version_macros
67/120 Test #10: mpi_init::python_mpiexec_nrnmpi_init ............................... Passed 15.85 sec
Start 18: coreneuron_modtests::fornetcon_py_cpu
68/120 Test #108: channel_benchmark_hippo::neuron .................................... Passed 165.95 sec
Start 19: coreneuron_modtests::direct_py_cpu
69/120 Test #110: channel_benchmark_hippo::coreneuron_cpu_filemode ................... Passed 165.88 sec
Start 20: coreneuron_modtests::direct_hoc_cpu
70/120 Test #105: testcorenrn_netstimdirect::direct_netstimdirect_psolve_alternate ... Passed 42.88 sec
Start 21: coreneuron_modtests::spikes_py_cpu
Start 22: coreneuron_modtests::spikes_file_mode_py_cpu
71/120 Test #11: mpi_init::python_mpiexec_mpienv .................................... Passed 16.73 sec
Start 23: coreneuron_modtests::fast_imem_py_cpu
72/120 Test #13: parallel_tests ..................................................... Passed 26.77 sec
Start 24: coreneuron_modtests::datareturn_py_cpu
73/120 Test #15: parallel_netpar .................................................... Passed 28.49 sec
Start 25: coreneuron_modtests::test_units_py_cpu
74/120 Test #20: coreneuron_modtests::direct_hoc_cpu ................................ Passed 28.63 sec
Start 26: coreneuron_modtests::test_netmove_py_cpu
75/120 Test #14: parallel_partrans .................................................. Passed 31.35 sec
Start 27: coreneuron_modtests::test_pointer_py_cpu
76/120 Test #94: testcorenrn_vecevent::coreneuron_cpu_online_psolve_alternate ....... Passed 37.86 sec
Start 95: testcorenrn_vecevent::coreneuron_cpu_offline
77/120 Test #17: coreneuron_modtests::version_macros ................................ Passed 37.87 sec
Start 28: coreneuron_modtests::test_watchrange_py_cpu
78/120 Test #23: coreneuron_modtests::fast_imem_py_cpu .............................. Passed 37.54 sec
Start 29: coreneuron_modtests::test_psolve_py_cpu
79/120 Test #22: coreneuron_modtests::spikes_file_mode_py_cpu ....................... Passed 38.13 sec
Start 30: coreneuron_modtests::test_ba_py_cpu
80/120 Test #21: coreneuron_modtests::spikes_py_cpu ................................. Passed 38.29 sec
Start 31: coreneuron_modtests::test_natrans_py_cpu
81/120 Test #19: coreneuron_modtests::direct_py_cpu ................................. Passed 38.88 sec
Start 35: modlunit_unitstest
82/120 Test #18: coreneuron_modtests::fornetcon_py_cpu .............................. Passed 40.89 sec
Start 36: modlunit_hh
83/120 Test #95: testcorenrn_vecevent::coreneuron_cpu_offline ....................... Passed 9.35 sec
Start 116: olfactory-bulb-3d::neuron
84/120 Test #36: modlunit_hh ........................................................ Passed 0.51 sec
Start 37: modlunit_stim
85/120 Test #35: modlunit_unitstest ................................................. Passed 2.46 sec
Start 38: modlunit_pattern
86/120 Test #38: modlunit_pattern ................................................... Passed 0.18 sec
Start 39: external_nrntest
87/120 Test #37: modlunit_stim ...................................................... Passed 0.38 sec
Start 42: reduced_dentate::compare_results
88/120 Test #24: coreneuron_modtests::datareturn_py_cpu ............................. Passed 26.60 sec
Start 56: testcorenrn_bbcore::compare_results
89/120 Test #42: reduced_dentate::compare_results ................................... Passed 1.49 sec
Start 62: testcorenrn_conc::compare_results
90/120 Test #56: testcorenrn_bbcore::compare_results ................................ Passed 1.97 sec
Start 68: testcorenrn_deriv::compare_results
91/120 Test #62: testcorenrn_conc::compare_results .................................. Passed 2.23 sec
Start 74: testcorenrn_gf::compare_results
92/120 Test #68: testcorenrn_deriv::compare_results ................................. Passed 2.48 sec
Start 80: testcorenrn_kin::compare_results
93/120 Test #25: coreneuron_modtests::test_units_py_cpu ............................. Passed 19.94 sec
Start 85: testcorenrn_patstim::compare_results
94/120 Test #74: testcorenrn_gf::compare_results .................................... Passed 3.05 sec
Start 91: testcorenrn_vecplay::compare_results
95/120 Test #80: testcorenrn_kin::compare_results ................................... Passed 2.86 sec
Start 97: testcorenrn_vecevent::compare_results
96/120 Test #91: testcorenrn_vecplay::compare_results ............................... Passed 2.16 sec
Start 103: testcorenrn_watch::compare_results
97/120 Test #97: testcorenrn_vecevent::compare_results .............................. Passed 2.34 sec
Start 106: testcorenrn_netstimdirect::compare_results
98/120 Test #85: testcorenrn_patstim::compare_results ............................... Passed 4.38 sec
Start 111: channel_benchmark_hippo::compare_results
99/120 Test #103: testcorenrn_watch::compare_results ................................. Passed 3.33 sec
Start 115: channel_benchmark_sscx::compare_results
100/120 Test #111: channel_benchmark_hippo::compare_results ........................... Passed 2.12 sec
101/120 Test #106: testcorenrn_netstimdirect::compare_results ......................... Passed 3.21 sec
Start 32: coreneuron_modtests::spikes_mpi_py_cpu
102/120 Test #26: coreneuron_modtests::test_netmove_py_cpu ........................... Passed 28.24 sec
103/120 Test #115: channel_benchmark_sscx::compare_results ............................ Passed 3.41 sec
Start 33: coreneuron_modtests::spikes_mpi_file_mode_py_cpu
104/120 Test #16: parallel_bas ....................................................... Passed 69.96 sec
105/120 Test #30: coreneuron_modtests::test_ba_py_cpu ................................ Passed 32.84 sec
Start 34: coreneuron_modtests::inputpresyn_py_cpu
106/120 Test #27: coreneuron_modtests::test_pointer_py_cpu ........................... Passed 45.88 sec
107/120 Test #33: coreneuron_modtests::spikes_mpi_file_mode_py_cpu ................... Passed 18.64 sec
108/120 Test #29: coreneuron_modtests::test_psolve_py_cpu ............................ Passed 42.06 sec
Start 118: olfactory-bulb-3d::coreneuron_cpu_online
109/120 Test #31: coreneuron_modtests::test_natrans_py_cpu ........................... Passed 42.76 sec
110/120 Test #28: coreneuron_modtests::test_watchrange_py_cpu ........................ Passed 45.80 sec
111/120 Test #32: coreneuron_modtests::spikes_mpi_py_cpu ............................. Passed 33.32 sec
112/120 Test #12: pynrn::basic_tests ................................................. Passed 109.26 sec
113/120 Test #34: coreneuron_modtests::inputpresyn_py_cpu ............................ Passed 35.98 sec
Start 49: external_ringtest::coreneuron_cpu_mpi_threads
114/120 Test #49: external_ringtest::coreneuron_cpu_mpi_threads ...................... Passed 53.98 sec
Start 50: external_ringtest::compare_results
115/120 Test #50: external_ringtest::compare_results ................................. Passed 0.71 sec
116/120 Test #116: olfactory-bulb-3d::neuron .......................................... Passed 140.53 sec
117/120 Test #118: olfactory-bulb-3d::coreneuron_cpu_online ........................... Passed 116.70 sec
Start 120: olfactory-bulb-3d::compare_results
118/120 Test #120: olfactory-bulb-3d::compare_results ................................. Passed 0.73 sec
119/120 Test #39: external_nrntest ................................................... Passed 298.72 sec
Start 107: tqperf::coreneuron
120/120 Test #107: tqperf::coreneuron ................................................. Passed 11.65 sec
100% tests passed, 0 tests failed out of 120
Total Test time (real) = 560.60 sec
$ cp -r Testing/ ${CI_PROJECT_DIR}/
$ module load unstable unit-test-translator
Autoloading python/3.9.7
$ cmake2junit > ${CI_PROJECT_DIR}/ctest.xml
$ exit ${i_am_a_failure}
section_end:1649844780:step_script section_start:1649844780:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=52852 revision=58ba2b95 version=14.2.0
initial_environment.env: found 1 matching files and directories
Testing/: found 7 matching files and directories 
Uploading artifacts as "archive" to coordinator... ok id=211039 responseStatus=201 Created token=8F8WMwP8
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=52902 revision=58ba2b95 version=14.2.0
ctest.xml: found 1 matching files and directories 
Uploading artifacts as "junit" to coordinator... ok id=211039 responseStatus=201 Created token=8F8WMwP8
section_end:1649844782:upload_artifacts_on_success section_start:1649844782:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1649844783:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1649844815:resolve_secrets Resolving secrets
section_end:1649844815:resolve_secrets section_start:1649844815:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor822041385, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 48472, build ref 9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6, job ID 211036
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472, optional exclusive flag , optional cpus per task flag --cpus-per-task=1, optional qos flag
A slurm job will be created with name GL_J211036_PROD_P112_CP6_C5
Job parameters: memory=76G, cpus_per_task=1, duration=1:00:00, constraint=volta ntasks=16 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
Submitted batch job 379354
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=16 --cpus-per-task=1 --mem=76G --job-name=GL_J211036_PROD_P112_CP6_C5 -C volta --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --ntasks=16 --jobid=379354 --cpus-per-task=1 --mem=76G
section_end:1649844817:prepare_executor section_start:1649844817:prepare_script Preparing environment
Running on ldir01u09.bbp.epfl.ch via bbpv1.epfl.ch...
section_end:1649844825:prepare_script section_start:1649844825:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1649844826:get_sources section_start:1649844826:download_artifacts Downloading artifacts
Downloading artifacts for build:neuron:mod2c:nvhpc:acc (211025)...
Runtime platform  arch=amd64 os=linux pid=38548 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=211025 responseStatus=200 OK token=xznJFiB7
section_end:1649844827:download_artifacts section_start:1649844827:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ cd ${SPACK_BUILD_DIR}
$ export BOOST_TEST_COLOR_OUTPUT=no
$ i_am_a_failure=0
$ spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
Site: r1i7n21
Build name: Linux-nvc++
Create new tag: 20220413-1014 - Experimental
Test project /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211025/spack-build/spack-stage-neuron-develop-sqrws2vsdssc3xwshyyrvkm5oyeurofv/spack-build-sqrws2v
Start 66: external_ringtest::coreneuron_cpu_mpi_offline::preparation
Start 71: external_ringtest::coreneuron_gpu_mpi_offline::preparation
Start 78: testcorenrn_bbcore::coreneuron_gpu_offline::preparation
Start 82: testcorenrn_bbcore::coreneuron_cpu_offline::preparation
Start 88: testcorenrn_conc::coreneuron_gpu_offline::preparation
Start 92: testcorenrn_conc::coreneuron_cpu_offline::preparation
Start 98: testcorenrn_deriv::coreneuron_gpu_offline::preparation
Start 102: testcorenrn_deriv::coreneuron_cpu_offline::preparation
Start 108: testcorenrn_gf::coreneuron_gpu_offline::preparation
Start 112: testcorenrn_gf::coreneuron_cpu_offline::preparation
Start 118: testcorenrn_kin::coreneuron_gpu_offline::preparation
Start 122: testcorenrn_kin::coreneuron_cpu_offline::preparation
1/184 Test #92: testcorenrn_conc::coreneuron_cpu_offline::preparation .............. Passed 3.80 sec
Start 127: testcorenrn_patstim::coreneuron_gpu_offline::preparation
2/184 Test #82: testcorenrn_bbcore::coreneuron_cpu_offline::preparation ............ Passed 3.84 sec
Start 61: external_ringtest::neuron
3/184 Test #88: testcorenrn_conc::coreneuron_gpu_offline::preparation .............. Passed 3.96 sec
Start 74: testcorenrn_bbcore::neuron
4/184 Test #98: testcorenrn_deriv::coreneuron_gpu_offline::preparation ............. Passed 4.00 sec
Start 75: testcorenrn_bbcore::coreneuron_gpu_online
5/184 Test #122: testcorenrn_kin::coreneuron_cpu_offline::preparation ............... Passed 4.03 sec
Start 76: testcorenrn_bbcore::coreneuron_gpu_online_psolve_alternate
6/184 Test #118: testcorenrn_kin::coreneuron_gpu_offline::preparation ............... Passed 4.05 sec
Start 79: testcorenrn_bbcore::coreneuron_cpu_online
7/184 Test #78: testcorenrn_bbcore::coreneuron_gpu_offline::preparation ............ Passed 4.09 sec
Start 77: testcorenrn_bbcore::coreneuron_gpu_offline
8/184 Test #112: testcorenrn_gf::coreneuron_cpu_offline::preparation ................ Passed 4.17 sec
Start 130: testcorenrn_patstim::coreneuron_cpu_offline::preparation
9/184 Test #102: testcorenrn_deriv::coreneuron_cpu_offline::preparation ............. Passed 5.89 sec
Start 80: testcorenrn_bbcore::coreneuron_cpu_online_psolve_alternate
10/184 Test #61: external_ringtest::neuron .......................................... Passed 2.65 sec
Start 81: testcorenrn_bbcore::coreneuron_cpu_offline
11/184 Test #108: testcorenrn_gf::coreneuron_gpu_offline::preparation ................ Passed 6.52 sec
Start 136: testcorenrn_vecplay::coreneuron_gpu_offline::preparation
12/184 Test #127: testcorenrn_patstim::coreneuron_gpu_offline::preparation ........... Passed 3.00 sec
Start 84: testcorenrn_conc::neuron
13/184 Test #74: testcorenrn_bbcore::neuron ......................................... Passed 3.28 sec
Start 85: testcorenrn_conc::coreneuron_gpu_online
14/184 Test #130: testcorenrn_patstim::coreneuron_cpu_offline::preparation ........... Passed 3.52 sec
Start 140: testcorenrn_vecplay::coreneuron_cpu_offline::preparation
15/184 Test #84: testcorenrn_conc::neuron ........................................... Passed 2.11 sec
Start 86: testcorenrn_conc::coreneuron_gpu_online_psolve_alternate
16/184 Test #136: testcorenrn_vecplay::coreneuron_gpu_offline::preparation ........... Passed 2.51 sec
Start 156: testcorenrn_watch::coreneuron_gpu_offline::preparation
17/184 Test #140: testcorenrn_vecplay::coreneuron_cpu_offline::preparation ........... Passed 1.76 sec
Start 160: testcorenrn_watch::coreneuron_cpu_offline::preparation
18/184 Test #71: external_ringtest::coreneuron_gpu_mpi_offline::preparation ......... Passed 9.75 sec
Start 62: external_ringtest::neuron_mpi
19/184 Test #156: testcorenrn_watch::coreneuron_gpu_offline::preparation ............. Passed 3.41 sec
Start 63: external_ringtest::coreneuron_cpu_mpi_offline_saverestore
20/184 Test #160: testcorenrn_watch::coreneuron_cpu_offline::preparation ............. Passed 3.79 sec
Start 64: external_ringtest::coreneuron_cpu_mpi
21/184 Test #81: testcorenrn_bbcore::coreneuron_cpu_offline ......................... Passed 9.59 sec
Start 87: testcorenrn_conc::coreneuron_gpu_offline
22/184 Test #79: testcorenrn_bbcore::coreneuron_cpu_online .......................... Passed 13.01 sec
Start 89: testcorenrn_conc::coreneuron_cpu_online
23/184 Test #80: testcorenrn_bbcore::coreneuron_cpu_online_psolve_alternate ......... Passed 14.89 sec
Start 90: testcorenrn_conc::coreneuron_cpu_online_psolve_alternate
24/184 Test #62: external_ringtest::neuron_mpi ...................................... Passed 11.41 sec
Start 68: external_ringtest::coreneuron_gpu_mpi_offline_saverestore
25/184 Test #66: external_ringtest::coreneuron_cpu_mpi_offline::preparation ......... Passed 21.24 sec
Start 65: external_ringtest::coreneuron_cpu_mpi_offline
26/184 Test #77: testcorenrn_bbcore::coreneuron_gpu_offline ......................... Passed 17.21 sec
Start 91: testcorenrn_conc::coreneuron_cpu_offline
27/184 Test #75: testcorenrn_bbcore::coreneuron_gpu_online .......................... Passed 17.52 sec
Start 94: testcorenrn_deriv::neuron
28/184 Test #85: testcorenrn_conc::coreneuron_gpu_online ............................ Passed 14.35 sec
Start 95: testcorenrn_deriv::coreneuron_gpu_online
29/184 Test #76: testcorenrn_bbcore::coreneuron_gpu_online_psolve_alternate ......... Passed 17.68 sec
Start 96: testcorenrn_deriv::coreneuron_gpu_online_psolve_alternate
30/184 Test #86: testcorenrn_conc::coreneuron_gpu_online_psolve_alternate ........... Passed 14.11 sec
Start 97: testcorenrn_deriv::coreneuron_gpu_offline
31/184 Test #94: testcorenrn_deriv::neuron .......................................... Passed 2.04 sec
Start 99: testcorenrn_deriv::coreneuron_cpu_online
32/184 Test #87: testcorenrn_conc::coreneuron_gpu_offline ........................... Passed 14.78 sec
Start 100: testcorenrn_deriv::coreneuron_cpu_online_psolve_alternate
33/184 Test #89: testcorenrn_conc::coreneuron_cpu_online ............................ Passed 16.79 sec
Start 101: testcorenrn_deriv::coreneuron_cpu_offline
34/184 Test #64: external_ringtest::coreneuron_cpu_mpi .............................. Passed 20.94 sec
Start 69: external_ringtest::coreneuron_gpu_mpi
35/184 Test #91: testcorenrn_conc::coreneuron_cpu_offline ........................... Passed 16.91 sec
Start 114: testcorenrn_kin::neuron
36/184 Test #90: testcorenrn_conc::coreneuron_cpu_online_psolve_alternate ........... Passed 18.71 sec
Start 115: testcorenrn_kin::coreneuron_gpu_online
37/184 Test #114: testcorenrn_kin::neuron ............................................ Passed 1.96 sec
Start 116: testcorenrn_kin::coreneuron_gpu_online_psolve_alternate
38/184 Test #65: external_ringtest::coreneuron_cpu_mpi_offline ...................... Passed 23.27 sec
Start 70: external_ringtest::coreneuron_gpu_mpi_offline
39/184 Test #99: testcorenrn_deriv::coreneuron_cpu_online ........................... Passed 21.00 sec
Start 117: testcorenrn_kin::coreneuron_gpu_offline
40/184 Test #95: testcorenrn_deriv::coreneuron_gpu_online ........................... Passed 23.22 sec
Start 119: testcorenrn_kin::coreneuron_cpu_online
41/184 Test #97: testcorenrn_deriv::coreneuron_gpu_offline .......................... Passed 21.98 sec
Start 120: testcorenrn_kin::coreneuron_cpu_online_psolve_alternate
42/184 Test #100: testcorenrn_deriv::coreneuron_cpu_online_psolve_alternate .......... Passed 15.41 sec
Start 121: testcorenrn_kin::coreneuron_cpu_offline
43/184 Test #96: testcorenrn_deriv::coreneuron_gpu_online_psolve_alternate .......... Passed 24.61 sec
Start 125: testcorenrn_patstim::coreneuron_gpu_offline_saverestore
44/184 Test #101: testcorenrn_deriv::coreneuron_cpu_offline .......................... Passed 16.70 sec
Start 126: testcorenrn_patstim::coreneuron_gpu_offline
45/184 Test #69: external_ringtest::coreneuron_gpu_mpi .............................. Passed 18.23 sec
Start 104: testcorenrn_gf::neuron
46/184 Test #104: testcorenrn_gf::neuron ............................................. Passed 2.31 sec
Start 105: testcorenrn_gf::coreneuron_gpu_online
47/184 Test #115: testcorenrn_kin::coreneuron_gpu_online ............................. Passed 15.39 sec
Start 166: channel_benchmark_hippo::neuron
48/184 Test #117: testcorenrn_kin::coreneuron_gpu_offline ............................ Passed 18.16 sec
Start 167: channel_benchmark_hippo::coreneuron_gpu_online
49/184 Test #116: testcorenrn_kin::coreneuron_gpu_online_psolve_alternate ............ Passed 23.76 sec
Start 168: channel_benchmark_hippo::coreneuron_gpu_filemode
50/184 Test #121: testcorenrn_kin::coreneuron_cpu_offline ............................ Passed 19.06 sec
Start 169: channel_benchmark_hippo::coreneuron_cpu_online
51/184 Test #119: testcorenrn_kin::coreneuron_cpu_online ............................. Passed 20.59 sec
Start 170: channel_benchmark_hippo::coreneuron_cpu_filemode
52/184 Test #120: testcorenrn_kin::coreneuron_cpu_online_psolve_alternate ............ Passed 21.92 sec
Start 172: channel_benchmark_sscx::neuron
53/184 Test #70: external_ringtest::coreneuron_gpu_mpi_offline ...................... Passed 22.62 sec
Start 106: testcorenrn_gf::coreneuron_gpu_online_psolve_alternate
54/184 Test #126: testcorenrn_patstim::coreneuron_gpu_offline ........................ Passed 32.90 sec
Start 173: channel_benchmark_sscx::coreneuron_gpu_online
55/184 Test #105: testcorenrn_gf::coreneuron_gpu_online .............................. Passed 55.16 sec
Start 107: testcorenrn_gf::coreneuron_gpu_offline
56/184 Test #125: testcorenrn_patstim::coreneuron_gpu_offline_saverestore ............ Passed 66.15 sec
Start 174: channel_benchmark_sscx::coreneuron_gpu_filemode
57/184 Test #63: external_ringtest::coreneuron_cpu_mpi_offline_saverestore .......... Passed 113.19 sec
Start 109: testcorenrn_gf::coreneuron_cpu_online
58/184 Test #68: external_ringtest::coreneuron_gpu_mpi_offline_saverestore .......... Passed 110.36 sec
Start 110: testcorenrn_gf::coreneuron_cpu_online_psolve_alternate
59/184 Test #109: testcorenrn_gf::coreneuron_cpu_online .............................. Passed 38.86 sec
Start 111: testcorenrn_gf::coreneuron_cpu_offline
60/184 Test #107: testcorenrn_gf::coreneuron_gpu_offline ............................. Passed 55.36 sec
Start 124: testcorenrn_patstim::neuron
61/184 Test #124: testcorenrn_patstim::neuron ........................................ Passed 5.11 sec
Start 128: testcorenrn_patstim::coreneuron_cpu_offline_saverestore
62/184 Test #106: testcorenrn_gf::coreneuron_gpu_online_psolve_alternate ............. Passed 115.78 sec
Start 129: testcorenrn_patstim::coreneuron_cpu_offline
63/184 Test #110: testcorenrn_gf::coreneuron_cpu_online_psolve_alternate ............. Passed 59.92 sec
Start 132: testcorenrn_vecplay::neuron
64/184 Test #132: testcorenrn_vecplay::neuron ........................................ Passed 4.11 sec
Start 133: testcorenrn_vecplay::coreneuron_gpu_online
65/184 Test #111: testcorenrn_gf::coreneuron_cpu_offline ............................. Passed 47.38 sec
Start 134: testcorenrn_vecplay::coreneuron_gpu_online_psolve_alternate
66/184 Test #166: channel_benchmark_hippo::neuron .................................... Passed 166.80 sec
Start 175: channel_benchmark_sscx::coreneuron_cpu_online
67/184 Test #129: testcorenrn_patstim::coreneuron_cpu_offline ........................ Passed 46.56 sec
Start 135: testcorenrn_vecplay::coreneuron_gpu_offline
68/184 Test #172: channel_benchmark_sscx::neuron ..................................... Passed 166.39 sec
Start 176: channel_benchmark_sscx::coreneuron_cpu_filemode
69/184 Test #133: testcorenrn_vecplay::coreneuron_gpu_online ......................... Passed 51.81 sec
Start 137: testcorenrn_vecplay::coreneuron_cpu_online
70/184 Test #168: channel_benchmark_hippo::coreneuron_gpu_filemode ................... Passed 213.54 sec
Start 1: testneuron
71/184 Test #1: testneuron ......................................................... Passed 0.05 sec
Start 2: ringtest
72/184 Test #167: channel_benchmark_hippo::coreneuron_gpu_online ..................... Passed 214.91 sec
Start 3: connect_dend
73/184 Test #3: connect_dend ....................................................... Passed 0.26 sec
Start 4: mpi_init::nrniv_mpiopt
74/184 Test #2: ringtest ........................................................... Passed 0.47 sec
Start 5: mpi_init::nrniv_nrnmpi_init
75/184 Test #5: mpi_init::nrniv_nrnmpi_init ........................................ Passed 0.24 sec
Start 6: mpi_init::python_nrnmpi_init
76/184 Test #4: mpi_init::nrniv_mpiopt ............................................. Passed 0.97 sec
Start 7: mpi_init::python_mpienv
77/184 Test #6: mpi_init::python_nrnmpi_init ....................................... Passed 1.16 sec
Start 8: mpi_init::nrniv_mpiexec_mpiopt
78/184 Test #173: channel_benchmark_sscx::coreneuron_gpu_online ...................... Passed 196.29 sec
Start 9: mpi_init::nrniv_mpiexec_nrnmpi_init
79/184 Test #8: mpi_init::nrniv_mpiexec_mpiopt ..................................... Passed 1.60 sec
Start 10: mpi_init::python_mpiexec_nrnmpi_init
80/184 Test #9: mpi_init::nrniv_mpiexec_nrnmpi_init ................................ Passed 1.38 sec
Start 11: mpi_init::python_mpiexec_mpienv
81/184 Test #128: testcorenrn_patstim::coreneuron_cpu_offline_saverestore ............ Passed 110.95 sec
Start 138: testcorenrn_vecplay::coreneuron_cpu_online_psolve_alternate
82/184 Test #7: mpi_init::python_mpienv ............................................ Passed 3.76 sec
Start 12: pynrn::basic_tests
83/184 Test #10: mpi_init::python_mpiexec_nrnmpi_init ............................... Passed 2.11 sec
Start 13: parallel_tests
84/184 Test #137: testcorenrn_vecplay::coreneuron_cpu_online ......................... Passed 37.94 sec
Start 139: testcorenrn_vecplay::coreneuron_cpu_offline
85/184 Test #174: channel_benchmark_sscx::coreneuron_gpu_filemode .................... Passed 173.38 sec
Start 14: parallel_partrans
86/184 Test #11: mpi_init::python_mpiexec_mpienv .................................... Passed 5.86 sec
Start 15: parallel_netpar
87/184 Test #169: channel_benchmark_hippo::coreneuron_cpu_online ..................... Passed 222.10 sec
Start 16: parallel_bas
88/184 Test #170: channel_benchmark_hippo::coreneuron_cpu_filemode ................... Passed 222.41 sec
Start 17: coreneuron_modtests::version_macros
89/184 Test #135: testcorenrn_vecplay::coreneuron_gpu_offline ........................ Passed 62.17 sec
Start 152: testcorenrn_watch::neuron
90/184 Test #175: channel_benchmark_sscx::coreneuron_cpu_online ...................... Passed 72.09 sec
Start 18: coreneuron_modtests::fornetcon_py_cpu
91/184 Test #152: testcorenrn_watch::neuron .......................................... Passed 2.99 sec
Start 153: testcorenrn_watch::coreneuron_gpu_online
92/184 Test #15: parallel_netpar .................................................... Passed 8.15 sec
Start 19: coreneuron_modtests::direct_py_cpu
93/184 Test #14: parallel_partrans .................................................. Passed 10.07 sec
Start 20: coreneuron_modtests::direct_hoc_cpu
94/184 Test #13: parallel_tests ..................................................... Passed 14.25 sec
Start 21: coreneuron_modtests::spikes_py_cpu
95/184 Test #139: testcorenrn_vecplay::coreneuron_cpu_offline ........................ Passed 14.57 sec
Start 154: testcorenrn_watch::coreneuron_gpu_online_psolve_alternate
96/184 Test #16: parallel_bas ....................................................... Passed 12.72 sec
Start 22: coreneuron_modtests::spikes_file_mode_py_cpu
97/184 Test #12: pynrn::basic_tests ................................................. Passed 18.79 sec
Start 23: coreneuron_modtests::fast_imem_py_cpu
98/184 Test #23: coreneuron_modtests::fast_imem_py_cpu .............................. Passed 0.88 sec
Start 24: coreneuron_modtests::datareturn_py_cpu
99/184 Test #134: testcorenrn_vecplay::coreneuron_gpu_online_psolve_alternate ........ Passed 95.56 sec
Start 155: testcorenrn_watch::coreneuron_gpu_offline
100/184 Test #17: coreneuron_modtests::version_macros ................................ Passed 25.51 sec
Start 25: coreneuron_modtests::test_units_py_cpu
101/184 Test #138: testcorenrn_vecplay::coreneuron_cpu_online_psolve_alternate ........ Passed 33.19 sec
Start 157: testcorenrn_watch::coreneuron_cpu_online
102/184 Test #19: coreneuron_modtests::direct_py_cpu ................................. Passed 27.65 sec
Start 26: coreneuron_modtests::test_netmove_py_cpu
103/184 Test #21: coreneuron_modtests::spikes_py_cpu ................................. Passed 28.19 sec
Start 27: coreneuron_modtests::test_pointer_py_cpu
104/184 Test #176: channel_benchmark_sscx::coreneuron_cpu_filemode .................... Passed 93.14 sec
Start 28: coreneuron_modtests::test_watchrange_py_cpu
105/184 Test #18: coreneuron_modtests::fornetcon_py_cpu .............................. Passed 33.32 sec
Start 29: coreneuron_modtests::test_psolve_py_cpu
106/184 Test #22: coreneuron_modtests::spikes_file_mode_py_cpu ....................... Passed 26.82 sec
Start 30: coreneuron_modtests::test_ba_py_cpu
107/184 Test #153: testcorenrn_watch::coreneuron_gpu_online ........................... Passed 33.00 sec
Start 158: testcorenrn_watch::coreneuron_cpu_online_psolve_alternate
108/184 Test #24: coreneuron_modtests::datareturn_py_cpu ............................. Passed 25.51 sec
Start 31: coreneuron_modtests::test_natrans_py_cpu
109/184 Test #20: coreneuron_modtests::direct_hoc_cpu ................................ Passed 34.20 sec
Start 35: coreneuron_modtests::fornetcon_py_gpu
110/184 Test #154: testcorenrn_watch::coreneuron_gpu_online_psolve_alternate .......... Passed 39.67 sec
Start 159: testcorenrn_watch::coreneuron_cpu_offline
111/184 Test #155: testcorenrn_watch::coreneuron_gpu_offline .......................... Passed 32.13 sec
Start 162: testcorenrn_netstimdirect::direct_netstimdirect
112/184 Test #25: coreneuron_modtests::test_units_py_cpu ............................. Passed 30.12 sec
Start 36: coreneuron_modtests::direct_py_gpu
113/184 Test #30: coreneuron_modtests::test_ba_py_cpu ................................ Passed 22.41 sec
Start 37: coreneuron_modtests::direct_hoc_gpu
114/184 Test #28: coreneuron_modtests::test_watchrange_py_cpu ........................ Passed 23.18 sec
Start 38: coreneuron_modtests::spikes_py_gpu
115/184 Test #29: coreneuron_modtests::test_psolve_py_cpu ............................ Passed 22.58 sec
Start 39: coreneuron_modtests::spikes_file_mode_py_gpu
116/184 Test #26: coreneuron_modtests::test_netmove_py_cpu ........................... Passed 26.92 sec
Start 40: coreneuron_modtests::fast_imem_py_gpu
117/184 Test #40: coreneuron_modtests::fast_imem_py_gpu .............................. Passed 1.27 sec
Start 41: coreneuron_modtests::datareturn_py_gpu
118/184 Test #31: coreneuron_modtests::test_natrans_py_cpu ........................... Passed 25.13 sec
Start 42: coreneuron_modtests::test_units_py_gpu
119/184 Test #35: coreneuron_modtests::fornetcon_py_gpu .............................. Passed 23.66 sec
Start 43: coreneuron_modtests::test_netmove_py_gpu
120/184 Test #158: testcorenrn_watch::coreneuron_cpu_online_psolve_alternate .......... Passed 26.48 sec
Start 163: testcorenrn_netstimdirect::direct_netstimdirect_psolve_alternate
121/184 Test #157: testcorenrn_watch::coreneuron_cpu_online ........................... Passed 51.13 sec
Start 32: coreneuron_modtests::spikes_mpi_py_cpu
122/184 Test #162: testcorenrn_netstimdirect::direct_netstimdirect .................... Passed 29.96 sec
Start 33: coreneuron_modtests::spikes_mpi_file_mode_py_cpu
123/184 Test #159: testcorenrn_watch::coreneuron_cpu_offline .......................... Passed 31.38 sec
Start 34: coreneuron_modtests::inputpresyn_py_cpu
124/184 Test #36: coreneuron_modtests::direct_py_gpu ................................. Passed 31.23 sec
Start 44: coreneuron_modtests::test_pointer_py_gpu
125/184 Test #37: coreneuron_modtests::direct_hoc_gpu ................................ Passed 27.21 sec
Start 45: coreneuron_modtests::test_watchrange_py_gpu
126/184 Test #39: coreneuron_modtests::spikes_file_mode_py_gpu ....................... Passed 27.27 sec
Start 46: coreneuron_modtests::test_psolve_py_gpu
127/184 Test #38: coreneuron_modtests::spikes_py_gpu ................................. Passed 27.34 sec
Start 47: coreneuron_modtests::test_ba_py_gpu
128/184 Test #41: coreneuron_modtests::datareturn_py_gpu ............................. Passed 30.61 sec
Start 48: coreneuron_modtests::test_natrans_py_gpu
129/184 Test #42: coreneuron_modtests::test_units_py_gpu ............................. Passed 31.46 sec
Start 52: modlunit_unitstest
130/184 Test #52: modlunit_unitstest ................................................. Passed 0.02 sec
Start 53: modlunit_hh
131/184 Test #53: modlunit_hh ........................................................ Passed 0.01 sec
Start 54: modlunit_stim
132/184 Test #54: modlunit_stim ...................................................... Passed 0.01 sec
Start 55: modlunit_pattern
133/184 Test #55: modlunit_pattern ................................................... Passed 0.01 sec
Start 56: external_nrntest
134/184 Test #163: testcorenrn_netstimdirect::direct_netstimdirect_psolve_alternate ... Passed 32.92 sec
Start 49: coreneuron_modtests::spikes_mpi_py_gpu
135/184 Test #43: coreneuron_modtests::test_netmove_py_gpu ........................... Passed 34.07 sec
Start 83: testcorenrn_bbcore::compare_results
136/184 Test #83: testcorenrn_bbcore::compare_results ................................ Passed 0.06 sec
Start 93: testcorenrn_conc::compare_results
137/184 Test #93: testcorenrn_conc::compare_results .................................. Passed 0.05 sec
Start 103: testcorenrn_deriv::compare_results
138/184 Test #103: testcorenrn_deriv::compare_results ................................. Passed 0.05 sec
Start 113: testcorenrn_gf::compare_results
139/184 Test #113: testcorenrn_gf::compare_results .................................... Passed 0.07 sec
Start 123: testcorenrn_kin::compare_results
140/184 Test #123: testcorenrn_kin::compare_results ................................... Passed 0.07 sec
Start 131: testcorenrn_patstim::compare_results
141/184 Test #131: testcorenrn_patstim::compare_results ............................... Passed 0.06 sec
Start 141: testcorenrn_vecplay::compare_results
142/184 Test #141: testcorenrn_vecplay::compare_results ............................... Passed 0.06 sec
Start 161: testcorenrn_watch::compare_results
143/184 Test #161: testcorenrn_watch::compare_results ................................. Passed 0.06 sec
Start 164: testcorenrn_netstimdirect::compare_results
144/184 Test #164: testcorenrn_netstimdirect::compare_results ......................... Passed 0.06 sec
Start 171: channel_benchmark_hippo::compare_results
145/184 Test #171: channel_benchmark_hippo::compare_results ........................... Passed 0.06 sec
Start 177: channel_benchmark_sscx::compare_results
146/184 Test #177: channel_benchmark_sscx::compare_results ............................ Passed 0.05 sec
147/184 Test #32: coreneuron_modtests::spikes_mpi_py_cpu ............................. Passed 25.68 sec
Start 50: coreneuron_modtests::spikes_mpi_file_mode_py_gpu
148/184 Test #33: coreneuron_modtests::spikes_mpi_file_mode_py_cpu ................... Passed 24.20 sec
Start 51: coreneuron_modtests::inputpresyn_py_gpu
149/184 Test #34: coreneuron_modtests::inputpresyn_py_cpu ............................ Passed 23.06 sec
150/184 Test #47: coreneuron_modtests::test_ba_py_gpu ................................ Passed 24.50 sec
Start 146: testcorenrn_vecevent::coreneuron_gpu_offline::preparation
151/184 Test #48: coreneuron_modtests::test_natrans_py_gpu ........................... Passed 19.88 sec
152/184 Test #45: coreneuron_modtests::test_watchrange_py_gpu ........................ Passed 24.81 sec
153/184 Test #46: coreneuron_modtests::test_psolve_py_gpu ............................ Passed 24.76 sec
154/184 Test #146: testcorenrn_vecevent::coreneuron_gpu_offline::preparation .......... Passed 2.26 sec
Start 150: testcorenrn_vecevent::coreneuron_cpu_offline::preparation
155/184 Test #150: testcorenrn_vecevent::coreneuron_cpu_offline::preparation .......... Passed 3.30 sec
Start 179: olfactory-bulb-3d::neuron::preparation
156/184 Test #179: olfactory-bulb-3d::neuron::preparation ............................. Passed 0.02 sec
Start 181: olfactory-bulb-3d::coreneuron_gpu_online::preparation
157/184 Test #181: olfactory-bulb-3d::coreneuron_gpu_online::preparation .............. Passed 0.02 sec
Start 183: olfactory-bulb-3d::coreneuron_cpu_online::preparation
158/184 Test #183: olfactory-bulb-3d::coreneuron_cpu_online::preparation .............. Passed 0.01 sec
Start 57: reduced_dentate::neuron
159/184 Test #49: coreneuron_modtests::spikes_mpi_py_gpu ............................. Passed 20.35 sec
Start 58: reduced_dentate::coreneuron_cpu
160/184 Test #50: coreneuron_modtests::spikes_mpi_file_mode_py_gpu ................... Passed 16.75 sec
161/184 Test #51: coreneuron_modtests::inputpresyn_py_gpu ............................ Passed 14.54 sec
Start 59: reduced_dentate::coreneuron_gpu
162/184 Test #27: coreneuron_modtests::test_pointer_py_cpu ........................... Passed 90.45 sec
163/184 Test #57: reduced_dentate::neuron ............................................ Passed 26.42 sec
Start 67: external_ringtest::coreneuron_cpu_mpi_threads
164/184 Test #44: coreneuron_modtests::test_pointer_py_gpu ........................... Passed 81.90 sec
165/184 Test #67: external_ringtest::coreneuron_cpu_mpi_threads ...................... Passed 31.67 sec
Start 72: external_ringtest::coreneuron_gpu_mpi_threads
166/184 Test #58: reduced_dentate::coreneuron_cpu .................................... Passed 71.36 sec
Start 142: testcorenrn_vecevent::neuron
167/184 Test #59: reduced_dentate::coreneuron_gpu .................................... Passed 70.60 sec
Start 143: testcorenrn_vecevent::coreneuron_gpu_online
Start 60: reduced_dentate::compare_results
168/184 Test #60: reduced_dentate::compare_results ................................... Passed 0.06 sec
169/184 Test #142: testcorenrn_vecevent::neuron ....................................... Passed 2.59 sec
Start 144: testcorenrn_vecevent::coreneuron_gpu_online_psolve_alternate
170/184 Test #143: testcorenrn_vecevent::coreneuron_gpu_online ........................ Passed 9.15 sec
Start 145: testcorenrn_vecevent::coreneuron_gpu_offline
171/184 Test #72: external_ringtest::coreneuron_gpu_mpi_threads ...................... Passed 23.45 sec
Start 147: testcorenrn_vecevent::coreneuron_cpu_online
Start 73: external_ringtest::compare_results
172/184 Test #73: external_ringtest::compare_results ................................. Passed 0.07 sec
173/184 Test #144: testcorenrn_vecevent::coreneuron_gpu_online_psolve_alternate ....... Passed 14.26 sec
Start 148: testcorenrn_vecevent::coreneuron_cpu_online_psolve_alternate
174/184 Test #145: testcorenrn_vecevent::coreneuron_gpu_offline ....................... Passed 12.21 sec
Start 149: testcorenrn_vecevent::coreneuron_cpu_offline
175/184 Test #147: testcorenrn_vecevent::coreneuron_cpu_online ........................ Passed 11.99 sec
Start 178: olfactory-bulb-3d::neuron
176/184 Test #148: testcorenrn_vecevent::coreneuron_cpu_online_psolve_alternate ....... Passed 6.99 sec
Start 180: olfactory-bulb-3d::coreneuron_gpu_online
177/184 Test #56: external_nrntest ................................................... Passed 119.50 sec
Start 182: olfactory-bulb-3d::coreneuron_cpu_online
178/184 Test #149: testcorenrn_vecevent::coreneuron_cpu_offline ....................... Passed 8.95 sec
Start 151: testcorenrn_vecevent::compare_results
179/184 Test #151: testcorenrn_vecevent::compare_results .............................. Passed 0.05 sec
180/184 Test #178: olfactory-bulb-3d::neuron .......................................... Passed 143.36 sec
181/184 Test #180: olfactory-bulb-3d::coreneuron_gpu_online ........................... Passed 154.67 sec
182/184 Test #182: olfactory-bulb-3d::coreneuron_cpu_online ........................... Passed 162.44 sec
Start 165: tqperf::coreneuron
183/184 Test #165: tqperf::coreneuron ................................................. Passed 53.49 sec
Start 184: olfactory-bulb-3d::compare_results
184/184 Test #184: olfactory-bulb-3d::compare_results ................................. Passed 0.23 sec
100% tests passed, 0 tests failed out of 184
Total Test time (real) = 720.50 sec
$ cp -r Testing/ ${CI_PROJECT_DIR}/
$ module load unstable unit-test-translator
Autoloading python/3.9.7
$ cmake2junit > ${CI_PROJECT_DIR}/ctest.xml
$ exit ${i_am_a_failure}
section_end:1649845572:step_script section_start:1649845572:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=60249 revision=58ba2b95 version=14.2.0
initial_environment.env: found 1 matching files and directories
Testing/: found 7 matching files and directories 
Uploading artifacts as "archive" to coordinator... ok id=211036 responseStatus=201 Created token=tMX5EYMs
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=60313 revision=58ba2b95 version=14.2.0
ctest.xml: found 1 matching files and directories 
Uploading artifacts as "junit" to coordinator... ok id=211036 responseStatus=201 Created token=tMX5EYMs
section_end:1649845574:upload_artifacts_on_success section_start:1649845574:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1649845575:cleanup_file_variables Job succeeded
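For reference, the scheduler pattern visible in the job above (a placeholder sbatch job kept alive with "sleep infinity", then the CI payload launched into the same allocation with srun --jobid) can be reproduced by hand roughly as follows. This is a sketch only: the partition, account and resource values are copied from the log above, the final <command> is a placeholder, and the explicit scancel at the end is an assumption rather than something shown in the log.

# hold an allocation open with a placeholder job; --parsable prints just the job id
jobid=$(sbatch --parsable -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G \
        -C cpu --no-requeue --time=1:00:00 --wrap="sleep infinity")
# run the actual workload inside that allocation (the runner does this for each CI step)
srun --mpi=none --jobid="${jobid}" --ntasks=2 --cpus-per-task=8 --mem=76G <command>
# release the nodes when finished
scancel "${jobid}"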
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1649845347:resolve_secrets Resolving secrets
section_end:1649845347:resolve_secrets section_start:1649845347:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor779252517, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 48472, build ref 9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6, job ID 211040
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472, optional exclusive flag , optional cpus per task flag --cpus-per-task=1, optional qos flag
A slurm job will be created with name GL_J211040_PROD_P112_CP2_C7
Job parameters: memory=76G, cpus_per_task=1, duration=1:00:00, constraint=cpu ntasks=16 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 379434
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=16 --cpus-per-task=1 --mem=76G --job-name=GL_J211040_PROD_P112_CP2_C7 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --ntasks=16 --jobid=379434 --cpus-per-task=1 --mem=76G
section_end:1649845348:prepare_executor section_start:1649845348:prepare_script Preparing environment
Running on r1i7n21 via bbpv1.epfl.ch...
section_end:1649845351:prepare_script section_start:1649845351:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1649845353:get_sources section_start:1649845353:download_artifacts Downloading artifacts
Downloading artifacts for build:neuron:nmodl:intel (211029)...
Runtime platform  arch=amd64 os=linux pid=60749 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=211029 responseStatus=200 OK token=zngzupmC
section_end:1649845354:download_artifacts section_start:1649845354:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ cd ${SPACK_BUILD_DIR}
$ export BOOST_TEST_COLOR_OUTPUT=no
$ i_am_a_failure=0
$ spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
Site: r1i4n0
Build name: Linux-icpc
Create new tag: 20220413-1023 - Experimental
Test project /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211029/spack-build/spack-stage-neuron-develop-4ljxz6sigfw2sqg4xtopvwa3qqcjfsk4/spack-build-4ljxz6s
Start 48: external_ringtest::coreneuron_cpu_mpi_offline::preparation
Start 55: testcorenrn_bbcore::coreneuron_cpu_offline::preparation
Start 61: testcorenrn_conc::coreneuron_cpu_offline::preparation
Start 67: testcorenrn_deriv::coreneuron_cpu_offline::preparation
Start 73: testcorenrn_gf::coreneuron_cpu_offline::preparation
Start 79: testcorenrn_kin::coreneuron_cpu_offline::preparation
Start 84: testcorenrn_patstim::coreneuron_cpu_offline::preparation
Start 90: testcorenrn_vecplay::coreneuron_cpu_offline::preparation
Start 96: testcorenrn_vecevent::coreneuron_cpu_offline::preparation
1/119 Test #61: testcorenrn_conc::coreneuron_cpu_offline::preparation .............. Passed 8.57 sec
Start 43: external_ringtest::neuron
2/119 Test #73: testcorenrn_gf::coreneuron_cpu_offline::preparation ................ Passed 9.28 sec
Start 102: testcorenrn_watch::coreneuron_cpu_offline::preparation
3/119 Test #84: testcorenrn_patstim::coreneuron_cpu_offline::preparation ........... Passed 10.56 sec
Start 44: external_ringtest::neuron_mpi
4/119 Test #55: testcorenrn_bbcore::coreneuron_cpu_offline::preparation ............ Passed 10.76 sec
Start 51: testcorenrn_bbcore::neuron
5/119 Test #90: testcorenrn_vecplay::coreneuron_cpu_offline::preparation ........... Passed 11.28 sec
Start 45: external_ringtest::coreneuron_cpu_mpi_offline_saverestore
6/119 Test #79: testcorenrn_kin::coreneuron_cpu_offline::preparation ............... Passed 11.38 sec
Start 52: testcorenrn_bbcore::coreneuron_cpu_online
7/119 Test #96: testcorenrn_vecevent::coreneuron_cpu_offline::preparation .......... Passed 11.74 sec
Start 116: olfactory-bulb-3d::neuron::preparation
8/119 Test #116: olfactory-bulb-3d::neuron::preparation ............................. Passed 0.02 sec
Start 118: olfactory-bulb-3d::coreneuron_cpu_online::preparation
9/119 Test #118: olfactory-bulb-3d::coreneuron_cpu_online::preparation .............. Passed 0.20 sec
Start 40: reduced_dentate::neuron
10/119 Test #67: testcorenrn_deriv::coreneuron_cpu_offline::preparation ............. Passed 12.98 sec
Start 53: testcorenrn_bbcore::coreneuron_cpu_online_psolve_alternate
11/119 Test #51: testcorenrn_bbcore::neuron ......................................... Passed 6.77 sec
Start 54: testcorenrn_bbcore::coreneuron_cpu_offline
12/119 Test #102: testcorenrn_watch::coreneuron_cpu_offline::preparation ............. Passed 8.56 sec
Start 46: external_ringtest::coreneuron_cpu_mpi
13/119 Test #43: external_ringtest::neuron .......................................... Passed 12.69 sec
Start 57: testcorenrn_conc::neuron
14/119 Test #48: external_ringtest::coreneuron_cpu_mpi_offline::preparation ......... Passed 21.65 sec
Start 47: external_ringtest::coreneuron_cpu_mpi_offline
15/119 Test #54: testcorenrn_bbcore::coreneuron_cpu_offline ......................... Passed 5.09 sec
Start 58: testcorenrn_conc::coreneuron_cpu_online
16/119 Test #44: external_ringtest::neuron_mpi ...................................... Passed 15.71 sec
Start 59: testcorenrn_conc::coreneuron_cpu_online_psolve_alternate
Start 60: testcorenrn_conc::coreneuron_cpu_offline
17/119 Test #60: testcorenrn_conc::coreneuron_cpu_offline ........................... Passed 5.56 sec
Start 63: testcorenrn_deriv::neuron
18/119 Test #47: external_ringtest::coreneuron_cpu_mpi_offline ...................... Passed 10.90 sec
Start 64: testcorenrn_deriv::coreneuron_cpu_online
Start 65: testcorenrn_deriv::coreneuron_cpu_online_psolve_alternate
19/119 Test #52: testcorenrn_bbcore::coreneuron_cpu_online .......................... Passed 24.96 sec
Start 66: testcorenrn_deriv::coreneuron_cpu_offline
20/119 Test #57: testcorenrn_conc::neuron ........................................... Passed 16.65 sec
Start 75: testcorenrn_kin::neuron
21/119 Test #40: reduced_dentate::neuron ............................................ Passed 29.67 sec
Start 41: reduced_dentate::coreneuron_cpu
22/119 Test #63: testcorenrn_deriv::neuron .......................................... Passed 9.80 sec
Start 76: testcorenrn_kin::coreneuron_cpu_online
23/119 Test #66: testcorenrn_deriv::coreneuron_cpu_offline .......................... Passed 5.33 sec
Start 77: testcorenrn_kin::coreneuron_cpu_online_psolve_alternate
24/119 Test #53: testcorenrn_bbcore::coreneuron_cpu_online_psolve_alternate ......... Passed 30.70 sec
Start 78: testcorenrn_kin::coreneuron_cpu_offline
25/119 Test #46: external_ringtest::coreneuron_cpu_mpi .............................. Passed 26.37 sec
Start 69: testcorenrn_gf::neuron
26/119 Test #59: testcorenrn_conc::coreneuron_cpu_online_psolve_alternate ........... Passed 17.86 sec
Start 107: channel_benchmark_hippo::neuron
27/119 Test #75: testcorenrn_kin::neuron ............................................ Passed 8.93 sec
Start 108: channel_benchmark_hippo::coreneuron_cpu_online
28/119 Test #65: testcorenrn_deriv::coreneuron_cpu_online_psolve_alternate .......... Passed 14.29 sec
Start 109: channel_benchmark_hippo::coreneuron_cpu_filemode
29/119 Test #58: testcorenrn_conc::coreneuron_cpu_online ............................ Passed 25.98 sec
Start 111: channel_benchmark_sscx::neuron
30/119 Test #64: testcorenrn_deriv::coreneuron_cpu_online ........................... Passed 15.75 sec
Start 112: channel_benchmark_sscx::coreneuron_cpu_online
31/119 Test #78: testcorenrn_kin::coreneuron_cpu_offline ............................ Passed 6.20 sec
Start 113: channel_benchmark_sscx::coreneuron_cpu_filemode
32/119 Test #45: external_ringtest::coreneuron_cpu_mpi_offline_saverestore .......... Passed 46.02 sec
Start 70: testcorenrn_gf::coreneuron_cpu_online
33/119 Test #76: testcorenrn_kin::coreneuron_cpu_online ............................. Passed 17.88 sec
Start 1: testneuron
34/119 Test #1: testneuron ......................................................... Passed 0.17 sec
Start 2: ringtest
35/119 Test #2: ringtest ........................................................... Passed 0.69 sec
Start 3: connect_dend
36/119 Test #3: connect_dend ....................................................... Passed 0.18 sec
Start 4: mpi_init::nrniv_mpiopt
37/119 Test #4: mpi_init::nrniv_mpiopt ............................................. Passed 0.70 sec
Start 5: mpi_init::nrniv_nrnmpi_init
38/119 Test #77: testcorenrn_kin::coreneuron_cpu_online_psolve_alternate ............ Passed 19.07 sec
Start 6: mpi_init::python_nrnmpi_init
39/119 Test #6: mpi_init::python_nrnmpi_init ....................................... Passed 1.44 sec
Start 7: mpi_init::python_mpienv
40/119 Test #5: mpi_init::nrniv_nrnmpi_init ........................................ Passed 2.06 sec
Start 8: mpi_init::nrniv_mpiexec_mpiopt
41/119 Test #69: testcorenrn_gf::neuron ............................................. Passed 20.46 sec
Start 71: testcorenrn_gf::coreneuron_cpu_online_psolve_alternate
42/119 Test #8: mpi_init::nrniv_mpiexec_mpiopt ..................................... Passed 2.45 sec
Start 9: mpi_init::nrniv_mpiexec_nrnmpi_init
43/119 Test #70: testcorenrn_gf::coreneuron_cpu_online .............................. Passed 17.88 sec
Start 72: testcorenrn_gf::coreneuron_cpu_offline
44/119 Test #7: mpi_init::python_mpienv ............................................ Passed 16.60 sec
Start 10: mpi_init::python_mpiexec_nrnmpi_init
45/119 Test #9: mpi_init::nrniv_mpiexec_nrnmpi_init ................................ Passed 15.60 sec
Start 11: mpi_init::python_mpiexec_mpienv
46/119 Test #72: testcorenrn_gf::coreneuron_cpu_offline ............................. Passed 15.57 sec
Start 81: testcorenrn_patstim::neuron
47/119 Test #10: mpi_init::python_mpiexec_nrnmpi_init ............................... Passed 26.13 sec
Start 12: pynrn::basic_tests
48/119 Test #81: testcorenrn_patstim::neuron ........................................ Passed 26.13 sec
Start 82: testcorenrn_patstim::coreneuron_cpu_offline_saverestore
49/119 Test #71: testcorenrn_gf::coreneuron_cpu_online_psolve_alternate ............. Passed 52.73 sec
Start 83: testcorenrn_patstim::coreneuron_cpu_offline
50/119 Test #11: mpi_init::python_mpiexec_mpienv .................................... Passed 47.09 sec
Start 13: parallel_tests
51/119 Test #83: testcorenrn_patstim::coreneuron_cpu_offline ........................ Passed 11.67 sec
Start 86: testcorenrn_vecplay::neuron
52/119 Test #107: channel_benchmark_hippo::neuron .................................... Passed 90.40 sec
Start 14: parallel_partrans
53/119 Test #86: testcorenrn_vecplay::neuron ........................................ Passed 25.38 sec
Start 87: testcorenrn_vecplay::coreneuron_cpu_online
54/119 Test #82: testcorenrn_patstim::coreneuron_cpu_offline_saverestore ............ Passed 47.11 sec
Start 88: testcorenrn_vecplay::coreneuron_cpu_online_psolve_alternate
55/119 Test #14: parallel_partrans .................................................. Passed 30.61 sec
Start 15: parallel_netpar
56/119 Test #13: parallel_tests ..................................................... Passed 39.23 sec
Start 16: parallel_bas
57/119 Test #12: pynrn::basic_tests ................................................. Passed 63.09 sec
Start 17: coreneuron_modtests::version_macros
58/119 Test #87: testcorenrn_vecplay::coreneuron_cpu_online ......................... Passed 15.68 sec
Start 89: testcorenrn_vecplay::coreneuron_cpu_offline
59/119 Test #15: parallel_netpar .................................................... Passed 10.04 sec
Start 18: coreneuron_modtests::fornetcon_py_cpu
60/119 Test #89: testcorenrn_vecplay::coreneuron_cpu_offline ........................ Passed 6.97 sec
Start 98: testcorenrn_watch::neuron
61/119 Test #98: testcorenrn_watch::neuron .......................................... Passed 5.27 sec
Start 99: testcorenrn_watch::coreneuron_cpu_online
62/119 Test #88: testcorenrn_vecplay::coreneuron_cpu_online_psolve_alternate ........ Passed 18.31 sec
Start 100: testcorenrn_watch::coreneuron_cpu_online_psolve_alternate
63/119 Test #17: coreneuron_modtests::version_macros ................................ Passed 18.53 sec
Start 19: coreneuron_modtests::direct_py_cpu
64/119 Test #99: testcorenrn_watch::coreneuron_cpu_online ........................... Passed 9.70 sec
Start 101: testcorenrn_watch::coreneuron_cpu_offline
65/119 Test #101: testcorenrn_watch::coreneuron_cpu_offline .......................... Passed 5.36 sec
Start 104: testcorenrn_netstimdirect::direct_netstimdirect
66/119 Test #100: testcorenrn_watch::coreneuron_cpu_online_psolve_alternate .......... Passed 17.02 sec
Start 105: testcorenrn_netstimdirect::direct_netstimdirect_psolve_alternate
67/119 Test #19: coreneuron_modtests::direct_py_cpu ................................. Passed 15.21 sec
Start 20: coreneuron_modtests::direct_hoc_cpu
68/119 Test #16: parallel_bas ....................................................... Passed 37.76 sec
Start 21: coreneuron_modtests::spikes_py_cpu
69/119 Test #20: coreneuron_modtests::direct_hoc_cpu ................................ Passed 3.74 sec
Start 22: coreneuron_modtests::spikes_file_mode_py_cpu
70/119 Test #104: testcorenrn_netstimdirect::direct_netstimdirect .................... Passed 9.09 sec
Start 23: coreneuron_modtests::fast_imem_py_cpu
Start 24: coreneuron_modtests::datareturn_py_cpu
71/119 Test #105: testcorenrn_netstimdirect::direct_netstimdirect_psolve_alternate ... Passed 11.40 sec
Start 25: coreneuron_modtests::test_units_py_cpu
Start 26: coreneuron_modtests::test_netmove_py_cpu
72/119 Test #112: channel_benchmark_sscx::coreneuron_cpu_online ...................... Passed 171.52 sec
Start 27: coreneuron_modtests::test_pointer_py_cpu
73/119 Test #109: channel_benchmark_hippo::coreneuron_cpu_filemode ................... Passed 181.20 sec
Start 28: coreneuron_modtests::test_watchrange_py_cpu
74/119 Test #113: channel_benchmark_sscx::coreneuron_cpu_filemode .................... Passed 178.78 sec
Start 29: coreneuron_modtests::test_psolve_py_cpu
75/119 Test #108: channel_benchmark_hippo::coreneuron_cpu_online ..................... Passed 184.90 sec
Start 30: coreneuron_modtests::test_ba_py_cpu
76/119 Test #111: channel_benchmark_sscx::neuron ..................................... Passed 184.04 sec
Start 31: coreneuron_modtests::test_natrans_py_cpu
77/119 Test #21: coreneuron_modtests::spikes_py_cpu ................................. Passed 38.77 sec
Start 35: modlunit_unitstest
78/119 Test #22: coreneuron_modtests::spikes_file_mode_py_cpu ....................... Passed 38.19 sec
Start 36: modlunit_hh
79/119 Test #35: modlunit_unitstest ................................................. Passed 0.02 sec
Start 37: modlunit_stim
80/119 Test #36: modlunit_hh ........................................................ Passed 0.02 sec
Start 38: modlunit_pattern
81/119 Test #37: modlunit_stim ...................................................... Passed 0.01 sec
Start 39: external_nrntest
82/119 Test #38: modlunit_pattern ................................................... Passed 0.01 sec
Start 56: testcorenrn_bbcore::compare_results
83/119 Test #41: reduced_dentate::coreneuron_cpu .................................... Passed 203.01 sec
Start 92: testcorenrn_vecevent::neuron
84/119 Test #56: testcorenrn_bbcore::compare_results ................................ Passed 0.31 sec
Start 42: reduced_dentate::compare_results
85/119 Test #42: reduced_dentate::compare_results ................................... Passed 0.14 sec
Start 62: testcorenrn_conc::compare_results
86/119 Test #23: coreneuron_modtests::fast_imem_py_cpu .............................. Passed 38.43 sec
Start 68: testcorenrn_deriv::compare_results
87/119 Test #62: testcorenrn_conc::compare_results .................................. Passed 0.65 sec
Start 74: testcorenrn_gf::compare_results
88/119 Test #25: coreneuron_modtests::test_units_py_cpu ............................. Passed 34.81 sec
Start 80: testcorenrn_kin::compare_results
89/119 Test #68: testcorenrn_deriv::compare_results ................................. Passed 0.83 sec
Start 85: testcorenrn_patstim::compare_results
90/119 Test #74: testcorenrn_gf::compare_results .................................... Passed 0.83 sec
Start 91: testcorenrn_vecplay::compare_results
91/119 Test #80: testcorenrn_kin::compare_results ................................... Passed 0.88 sec
Start 103: testcorenrn_watch::compare_results
92/119 Test #24: coreneuron_modtests::datareturn_py_cpu ............................. Passed 40.71 sec
Start 106: testcorenrn_netstimdirect::compare_results
93/119 Test #26: coreneuron_modtests::test_netmove_py_cpu ........................... Passed 36.41 sec
Start 110: channel_benchmark_hippo::compare_results
94/119 Test #85: testcorenrn_patstim::compare_results ............................... Passed 2.09 sec
Start 114: channel_benchmark_sscx::compare_results
95/119 Test #28: coreneuron_modtests::test_watchrange_py_cpu ........................ Passed 19.49 sec
96/119 Test #91: testcorenrn_vecplay::compare_results ............................... Passed 1.89 sec
Start 32: coreneuron_modtests::spikes_mpi_py_cpu
97/119 Test #110: channel_benchmark_hippo::compare_results ........................... Passed 1.22 sec
98/119 Test #29: coreneuron_modtests::test_psolve_py_cpu ............................ Passed 20.12 sec
Start 33: coreneuron_modtests::spikes_mpi_file_mode_py_cpu
99/119 Test #103: testcorenrn_watch::compare_results ................................. Passed 2.22 sec
100/119 Test #30: coreneuron_modtests::test_ba_py_cpu ................................ Passed 17.10 sec
Start 34: coreneuron_modtests::inputpresyn_py_cpu
101/119 Test #31: coreneuron_modtests::test_natrans_py_cpu ........................... Passed 17.09 sec
102/119 Test #114: channel_benchmark_sscx::compare_results ............................ Passed 1.78 sec
103/119 Test #106: testcorenrn_netstimdirect::compare_results ......................... Passed 2.51 sec
104/119 Test #92: testcorenrn_vecevent::neuron ....................................... Passed 7.11 sec
Start 49: external_ringtest::coreneuron_cpu_mpi_threads
105/119 Test #27: coreneuron_modtests::test_pointer_py_cpu ........................... Passed 33.65 sec
106/119 Test #34: coreneuron_modtests::inputpresyn_py_cpu ............................ Passed 5.89 sec
Start 93: testcorenrn_vecevent::coreneuron_cpu_online
107/119 Test #33: coreneuron_modtests::spikes_mpi_file_mode_py_cpu ................... Passed 6.05 sec
108/119 Test #32: coreneuron_modtests::spikes_mpi_py_cpu ............................. Passed 6.84 sec
Start 94: testcorenrn_vecevent::coreneuron_cpu_online_psolve_alternate
109/119 Test #18: coreneuron_modtests::fornetcon_py_cpu .............................. Passed 81.00 sec
110/119 Test #49: external_ringtest::coreneuron_cpu_mpi_threads ...................... Passed 5.14 sec
Start 95: testcorenrn_vecevent::coreneuron_cpu_offline
Start 50: external_ringtest::compare_results
111/119 Test #50: external_ringtest::compare_results ................................. Passed 1.26 sec
112/119 Test #95: testcorenrn_vecevent::coreneuron_cpu_offline ....................... Passed 7.59 sec
Start 115: olfactory-bulb-3d::neuron
113/119 Test #93: testcorenrn_vecevent::coreneuron_cpu_online ........................ Passed 18.74 sec
Start 117: olfactory-bulb-3d::coreneuron_cpu_online
114/119 Test #94: testcorenrn_vecevent::coreneuron_cpu_online_psolve_alternate ....... Passed 19.54 sec
Start 97: testcorenrn_vecevent::compare_results
115/119 Test #97: testcorenrn_vecevent::compare_results .............................. Passed 0.40 sec
116/119 Test #115: olfactory-bulb-3d::neuron .......................................... Passed 186.90 sec
117/119 Test #117: olfactory-bulb-3d::coreneuron_cpu_online ........................... Passed 187.84 sec
Start 119: olfactory-bulb-3d::compare_results
118/119 Test #119: olfactory-bulb-3d::compare_results ................................. Passed 0.20 sec
119/119 Test #39: external_nrntest ................................................... Passed 265.20 sec
100% tests passed, 0 tests failed out of 119
Total Test time (real) = 509.98 sec
$ cp -r Testing/ ${CI_PROJECT_DIR}/
$ module load unstable unit-test-translator
Autoloading python/3.9.7
$ cmake2junit > ${CI_PROJECT_DIR}/ctest.xml
$ exit ${i_am_a_failure}
section_end:1649845905:step_script section_start:1649845905:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=74299 revision=58ba2b95 version=14.2.0
initial_environment.env: found 1 matching files and directories
Testing/: found 7 matching files and directories 
Uploading artifacts as "archive" to coordinator... ok id=211040 responseStatus=201 Created token=3WX5bxxu
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=74353 revision=58ba2b95 version=14.2.0
ctest.xml: found 1 matching files and directories 
Uploading artifacts as "junit" to coordinator... ok id=211040 responseStatus=201 Created token=3WX5bxxu
section_end:1649845906:upload_artifacts_on_success section_start:1649845906:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1649845907:cleanup_file_variables Job succeeded
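The test step in the job above is worth noting as a pattern: the ctest exit status is stored in i_am_a_failure instead of aborting the script, so the Testing/ directory and the JUnit report are still uploaded when tests fail, and the job's final status is decided by the deferred exit. A minimal standalone sketch of the same sequence, assembled from the commands in the log (SPACK_ROOT, SPACK_BUILD_DIR, SPACK_FULL_SPEC, CI_PROJECT_DIR and cmake2junit come from the CI environment; the fallback value for CTEST_PARALLEL_LEVEL outside SLURM is an assumption):

#!/bin/bash
# let ctest use as many parallel slots as SLURM gave the job; fall back to 8 elsewhere (assumption)
export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE:-8}
. "${SPACK_ROOT}/share/spack/setup-env.sh"
cd "${SPACK_BUILD_DIR}"
# keep Boost.Test output plain so the CI log stays readable
export BOOST_TEST_COLOR_OUTPUT=no
i_am_a_failure=0
# run the suite inside the package's build environment; record failure instead of exiting early
spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
# keep the raw CTest results and a JUnit translation for the GitLab test report UI
cp -r Testing/ "${CI_PROJECT_DIR}/"
cmake2junit > "${CI_PROJECT_DIR}/ctest.xml"
exit ${i_am_a_failure}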
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1649845872:resolve_secrets Resolving secrets
section_end:1649845872:resolve_secrets section_start:1649845872:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor406661503, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 48472, build ref 9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6, job ID 211038
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472, optional exclusive flag , optional cpus per task flag --cpus-per-task=1, optional qos flag
A slurm job will be created with name GL_J211038_PROD_P112_CP3_C2
Job parameters: memory=76G, cpus_per_task=1, duration=1:00:00, constraint=volta ntasks=16 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
Submitted batch job 379531
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=16 --cpus-per-task=1 --mem=76G --job-name=GL_J211038_PROD_P112_CP3_C2 -C volta --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --ntasks=16 --jobid=379531 --cpus-per-task=1 --mem=76G
section_end:1649845873:prepare_executor section_start:1649845873:prepare_script Preparing environment
Running on ldir01u09.bbp.epfl.ch via bbpv1.epfl.ch...
section_end:1649845881:prepare_script section_start:1649845881:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1649845883:get_sources section_start:1649845883:download_artifacts Downloading artifacts
Downloading artifacts for build:neuron:nmodl:nvhpc:acc (211027)...
Runtime platform  arch=amd64 os=linux pid=68007 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=211027 responseStatus=200 OK token=sQF2SFDP
section_end:1649845884:download_artifacts section_start:1649845884:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ cd ${SPACK_BUILD_DIR}
$ export BOOST_TEST_COLOR_OUTPUT=no
$ i_am_a_failure=0
$ spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
Site: r1i4n0
Build name: Linux-nvc++
Create new tag: 20220413-1032 - Experimental
Test project /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211027/spack-build/spack-stage-neuron-develop-aw3evwlkdey4vmmjzpsuscd3bnfgndfy/spack-build-aw3evwl
Start 66: external_ringtest::coreneuron_cpu_mpi_offline::preparation
Start 71: external_ringtest::coreneuron_gpu_mpi_offline::preparation
Start 78: testcorenrn_bbcore::coreneuron_gpu_offline::preparation
Start 82: testcorenrn_bbcore::coreneuron_cpu_offline::preparation
Start 88: testcorenrn_conc::coreneuron_gpu_offline::preparation
Start 92: testcorenrn_conc::coreneuron_cpu_offline::preparation
Start 98: testcorenrn_deriv::coreneuron_gpu_offline::preparation
Start 102: testcorenrn_deriv::coreneuron_cpu_offline::preparation
Start 108: testcorenrn_gf::coreneuron_gpu_offline::preparation
Start 112: testcorenrn_gf::coreneuron_cpu_offline::preparation
Start 118: testcorenrn_kin::coreneuron_gpu_offline::preparation
Start 122: testcorenrn_kin::coreneuron_cpu_offline::preparation
1/183 Test #78: testcorenrn_bbcore::coreneuron_gpu_offline::preparation ............ Passed 3.17 sec
Start 127: testcorenrn_patstim::coreneuron_gpu_offline::preparation
2/183 Test #92: testcorenrn_conc::coreneuron_cpu_offline::preparation .............. Passed 3.19 sec
Start 61: external_ringtest::neuron
3/183 Test #88: testcorenrn_conc::coreneuron_gpu_offline::preparation .............. Passed 3.22 sec
Start 74: testcorenrn_bbcore::neuron
4/183 Test #118: testcorenrn_kin::coreneuron_gpu_offline::preparation ............... Passed 3.31 sec
Start 75: testcorenrn_bbcore::coreneuron_gpu_online
5/183 Test #82: testcorenrn_bbcore::coreneuron_cpu_offline::preparation ............ Passed 3.45 sec
Start 76: testcorenrn_bbcore::coreneuron_gpu_online_psolve_alternate
6/183 Test #122: testcorenrn_kin::coreneuron_cpu_offline::preparation ............... Passed 3.73 sec
Start 77: testcorenrn_bbcore::coreneuron_gpu_offline
7/183 Test #98: testcorenrn_deriv::coreneuron_gpu_offline::preparation ............. Passed 4.16 sec
Start 79: testcorenrn_bbcore::coreneuron_cpu_online
8/183 Test #102: testcorenrn_deriv::coreneuron_cpu_offline::preparation ............. Passed 4.26 sec
Start 80: testcorenrn_bbcore::coreneuron_cpu_online_psolve_alternate
9/183 Test #112: testcorenrn_gf::coreneuron_cpu_offline::preparation ................ Passed 7.76 sec
Start 130: testcorenrn_patstim::coreneuron_cpu_offline::preparation
10/183 Test #127: testcorenrn_patstim::coreneuron_gpu_offline::preparation ........... Passed 5.73 sec
Start 81: testcorenrn_bbcore::coreneuron_cpu_offline
11/183 Test #108: testcorenrn_gf::coreneuron_gpu_offline::preparation ................ Passed 9.13 sec
Start 136: testcorenrn_vecplay::coreneuron_gpu_offline::preparation
12/183 Test #74: testcorenrn_bbcore::neuron ......................................... Passed 6.11 sec
Start 84: testcorenrn_conc::neuron
13/183 Test #61: external_ringtest::neuron .......................................... Passed 6.84 sec
Start 85: testcorenrn_conc::coreneuron_gpu_online
14/183 Test #130: testcorenrn_patstim::coreneuron_cpu_offline::preparation ........... Passed 2.99 sec
Start 140: testcorenrn_vecplay::coreneuron_cpu_offline::preparation
15/183 Test #84: testcorenrn_conc::neuron ........................................... Passed 1.50 sec
Start 86: testcorenrn_conc::coreneuron_gpu_online_psolve_alternate
16/183 Test #136: testcorenrn_vecplay::coreneuron_gpu_offline::preparation ........... Passed 1.84 sec
Start 156: testcorenrn_watch::coreneuron_gpu_offline::preparation
17/183 Test #140: testcorenrn_vecplay::coreneuron_cpu_offline::preparation ........... Passed 2.09 sec
Start 160: testcorenrn_watch::coreneuron_cpu_offline::preparation
18/183 Test #160: testcorenrn_watch::coreneuron_cpu_offline::preparation ............. Passed 7.34 sec
Start 62: external_ringtest::neuron_mpi
19/183 Test #156: testcorenrn_watch::coreneuron_gpu_offline::preparation ............. Passed 11.48 sec
Start 63: external_ringtest::coreneuron_cpu_mpi_offline_saverestore
20/183 Test #62: external_ringtest::neuron_mpi ...................................... Passed 4.06 sec
Start 64: external_ringtest::coreneuron_cpu_mpi
21/183 Test #77: testcorenrn_bbcore::coreneuron_gpu_offline ......................... Passed 28.52 sec
Start 87: testcorenrn_conc::coreneuron_gpu_offline
22/183 Test #66: external_ringtest::coreneuron_cpu_mpi_offline::preparation ......... Passed 40.79 sec
Start 65: external_ringtest::coreneuron_cpu_mpi_offline
23/183 Test #85: testcorenrn_conc::coreneuron_gpu_online ............................ Passed 30.83 sec
Start 89: testcorenrn_conc::coreneuron_cpu_online
24/183 Test #79: testcorenrn_bbcore::coreneuron_cpu_online .......................... Passed 45.46 sec
Start 90: testcorenrn_conc::coreneuron_cpu_online_psolve_alternate
25/183 Test #80: testcorenrn_bbcore::coreneuron_cpu_online_psolve_alternate ......... Passed 45.41 sec
Start 91: testcorenrn_conc::coreneuron_cpu_offline
26/183 Test #71: external_ringtest::coreneuron_gpu_mpi_offline::preparation ......... Passed 49.76 sec
Start 68: external_ringtest::coreneuron_gpu_mpi_offline_saverestore
27/183 Test #75: testcorenrn_bbcore::coreneuron_gpu_online .......................... Passed 46.45 sec
Start 94: testcorenrn_deriv::neuron
28/183 Test #81: testcorenrn_bbcore::coreneuron_cpu_offline ......................... Passed 40.89 sec
Start 95: testcorenrn_deriv::coreneuron_gpu_online
29/183 Test #76: testcorenrn_bbcore::coreneuron_gpu_online_psolve_alternate ......... Passed 47.57 sec
Start 96: testcorenrn_deriv::coreneuron_gpu_online_psolve_alternate
30/183 Test #94: testcorenrn_deriv::neuron .......................................... Passed 2.08 sec
Start 97: testcorenrn_deriv::coreneuron_gpu_offline
31/183 Test #64: external_ringtest::coreneuron_cpu_mpi .............................. Passed 58.83 sec
Start 69: external_ringtest::coreneuron_gpu_mpi
32/183 Test #89: testcorenrn_conc::coreneuron_cpu_online ............................ Passed 42.69 sec
Start 99: testcorenrn_deriv::coreneuron_cpu_online
33/183 Test #87: testcorenrn_conc::coreneuron_gpu_offline ........................... Passed 51.51 sec
Start 100: testcorenrn_deriv::coreneuron_cpu_online_psolve_alternate
34/183 Test #86: testcorenrn_conc::coreneuron_gpu_online_psolve_alternate ........... Passed 73.01 sec
Start 101: testcorenrn_deriv::coreneuron_cpu_offline
35/183 Test #65: external_ringtest::coreneuron_cpu_mpi_offline ...................... Passed 53.56 sec
Start 70: external_ringtest::coreneuron_gpu_mpi_offline
36/183 Test #91: testcorenrn_conc::coreneuron_cpu_offline ........................... Passed 56.80 sec
Start 114: testcorenrn_kin::neuron
37/183 Test #95: testcorenrn_deriv::coreneuron_gpu_online ........................... Passed 57.56 sec
Start 115: testcorenrn_kin::coreneuron_gpu_online
38/183 Test #114: testcorenrn_kin::neuron ............................................ Passed 2.11 sec
Start 116: testcorenrn_kin::coreneuron_gpu_online_psolve_alternate
39/183 Test #90: testcorenrn_conc::coreneuron_cpu_online_psolve_alternate ........... Passed 69.85 sec
Start 117: testcorenrn_kin::coreneuron_gpu_offline
40/183 Test #97: testcorenrn_deriv::coreneuron_gpu_offline .......................... Passed 68.51 sec
Start 119: testcorenrn_kin::coreneuron_cpu_online
41/183 Test #101: testcorenrn_deriv::coreneuron_cpu_offline .......................... Passed 51.75 sec
Start 120: testcorenrn_kin::coreneuron_cpu_online_psolve_alternate
42/183 Test #99: testcorenrn_deriv::coreneuron_cpu_online ........................... Passed 52.14 sec
Start 121: testcorenrn_kin::coreneuron_cpu_offline
43/183 Test #100: testcorenrn_deriv::coreneuron_cpu_online_psolve_alternate .......... Passed 51.90 sec
Start 125: testcorenrn_patstim::coreneuron_gpu_offline_saverestore
44/183 Test #69: external_ringtest::coreneuron_gpu_mpi .............................. Passed 52.44 sec
Start 104: testcorenrn_gf::neuron
45/183 Test #96: testcorenrn_deriv::coreneuron_gpu_online_psolve_alternate .......... Passed 84.81 sec
Start 126: testcorenrn_patstim::coreneuron_gpu_offline
46/183 Test #104: testcorenrn_gf::neuron ............................................. Passed 2.12 sec
Start 105: testcorenrn_gf::coreneuron_gpu_online
47/183 Test #70: external_ringtest::coreneuron_gpu_mpi_offline ...................... Passed 62.55 sec
Start 106: testcorenrn_gf::coreneuron_gpu_online_psolve_alternate
48/183 Test #115: testcorenrn_kin::coreneuron_gpu_online ............................. Passed 54.61 sec
Start 165: channel_benchmark_hippo::neuron
49/183 Test #117: testcorenrn_kin::coreneuron_gpu_offline ............................ Passed 43.36 sec
Start 166: channel_benchmark_hippo::coreneuron_gpu_online
50/183 Test #119: testcorenrn_kin::coreneuron_cpu_online ............................. Passed 55.50 sec
Start 167: channel_benchmark_hippo::coreneuron_gpu_filemode
51/183 Test #116: testcorenrn_kin::coreneuron_gpu_online_psolve_alternate ............ Passed 74.84 sec
Start 168: channel_benchmark_hippo::coreneuron_cpu_online
52/183 Test #120: testcorenrn_kin::coreneuron_cpu_online_psolve_alternate ............ Passed 54.59 sec
Start 169: channel_benchmark_hippo::coreneuron_cpu_filemode
53/183 Test #121: testcorenrn_kin::coreneuron_cpu_offline ............................ Passed 66.04 sec
Start 171: channel_benchmark_sscx::neuron
54/183 Test #126: testcorenrn_patstim::coreneuron_gpu_offline ........................ Passed 74.46 sec
Start 172: channel_benchmark_sscx::coreneuron_gpu_online
55/183 Test #105: testcorenrn_gf::coreneuron_gpu_online .............................. Passed 100.89 sec
Start 107: testcorenrn_gf::coreneuron_gpu_offline
56/183 Test #106: testcorenrn_gf::coreneuron_gpu_online_psolve_alternate ............. Passed 106.52 sec
Start 109: testcorenrn_gf::coreneuron_cpu_online
57/183 Test #165: channel_benchmark_hippo::neuron .................................... Passed 118.91 sec
Start 173: channel_benchmark_sscx::coreneuron_gpu_filemode
58/183 Test #125: testcorenrn_patstim::coreneuron_gpu_offline_saverestore ............ Passed 163.79 sec
Start 174: channel_benchmark_sscx::coreneuron_cpu_online
59/183 Test #63: external_ringtest::coreneuron_cpu_mpi_offline_saverestore .......... Passed 278.93 sec
Start 110: testcorenrn_gf::coreneuron_cpu_online_psolve_alternate
60/183 Test #171: channel_benchmark_sscx::neuron ..................................... Passed 132.53 sec
Start 175: channel_benchmark_sscx::coreneuron_cpu_filemode
61/183 Test #166: channel_benchmark_hippo::coreneuron_gpu_online ..................... Passed 173.42 sec
Start 1: testneuron
62/183 Test #1: testneuron ......................................................... Passed 0.06 sec
Start 2: ringtest
63/183 Test #2: ringtest ........................................................... Passed 0.24 sec
Start 3: connect_dend
64/183 Test #3: connect_dend ....................................................... Passed 0.17 sec
Start 4: mpi_init::nrniv_mpiopt
65/183 Test #4: mpi_init::nrniv_mpiopt ............................................. Passed 0.52 sec
Start 5: mpi_init::nrniv_nrnmpi_init
66/183 Test #5: mpi_init::nrniv_nrnmpi_init ........................................ Passed 0.22 sec
Start 6: mpi_init::python_nrnmpi_init
67/183 Test #6: mpi_init::python_nrnmpi_init ....................................... Passed 0.78 sec
Start 7: mpi_init::python_mpienv
68/183 Test #7: mpi_init::python_mpienv ............................................ Passed 2.85 sec
Start 8: mpi_init::nrniv_mpiexec_mpiopt
69/183 Test #8: mpi_init::nrniv_mpiexec_mpiopt ..................................... Passed 1.47 sec
Start 9: mpi_init::nrniv_mpiexec_nrnmpi_init
70/183 Test #9: mpi_init::nrniv_mpiexec_nrnmpi_init ................................ Passed 1.36 sec
Start 10: mpi_init::python_mpiexec_nrnmpi_init
71/183 Test #107: testcorenrn_gf::coreneuron_gpu_offline ............................. Passed 105.47 sec
Start 111: testcorenrn_gf::coreneuron_cpu_offline
72/183 Test #10: mpi_init::python_mpiexec_nrnmpi_init ............................... Passed 1.87 sec
Start 11: mpi_init::python_mpiexec_mpienv
73/183 Test #11: mpi_init::python_mpiexec_mpienv .................................... Passed 2.60 sec
Start 12: pynrn::basic_tests
74/183 Test #109: testcorenrn_gf::coreneuron_cpu_online .............................. Passed 105.22 sec
Start 124: testcorenrn_patstim::neuron
75/183 Test #68: external_ringtest::coreneuron_gpu_mpi_offline_saverestore .......... Passed 324.95 sec
Start 128: testcorenrn_patstim::coreneuron_cpu_offline_saverestore
76/183 Test #124: testcorenrn_patstim::neuron ........................................ Passed 8.96 sec
Start 129: testcorenrn_patstim::coreneuron_cpu_offline
77/183 Test #12: pynrn::basic_tests ................................................. Passed 29.72 sec
Start 13: parallel_tests
78/183 Test #168: channel_benchmark_hippo::coreneuron_cpu_online ..................... Passed 199.56 sec
Start 14: parallel_partrans
79/183 Test #167: channel_benchmark_hippo::coreneuron_gpu_filemode ................... Passed 207.47 sec
Start 15: parallel_netpar
80/183 Test #169: channel_benchmark_hippo::coreneuron_cpu_filemode ................... Passed 193.21 sec
Start 16: parallel_bas
81/183 Test #13: parallel_tests ..................................................... Passed 7.29 sec
Start 17: coreneuron_modtests::version_macros
82/183 Test #14: parallel_partrans .................................................. Passed 3.32 sec
Start 18: coreneuron_modtests::fornetcon_py_cpu
83/183 Test #15: parallel_netpar .................................................... Passed 3.18 sec
Start 19: coreneuron_modtests::direct_py_cpu
84/183 Test #16: parallel_bas ....................................................... Passed 4.84 sec
Start 20: coreneuron_modtests::direct_hoc_cpu
85/183 Test #172: channel_benchmark_sscx::coreneuron_gpu_online ...................... Passed 180.04 sec
Start 21: coreneuron_modtests::spikes_py_cpu
86/183 Test #111: testcorenrn_gf::coreneuron_cpu_offline ............................. Passed 65.77 sec
Start 132: testcorenrn_vecplay::neuron
87/183 Test #174: channel_benchmark_sscx::coreneuron_cpu_online ...................... Passed 111.09 sec
Start 22: coreneuron_modtests::spikes_file_mode_py_cpu
88/183 Test #173: channel_benchmark_sscx::coreneuron_gpu_filemode .................... Passed 130.99 sec
Start 23: coreneuron_modtests::fast_imem_py_cpu
89/183 Test #132: testcorenrn_vecplay::neuron ........................................ Passed 2.68 sec
Start 133: testcorenrn_vecplay::coreneuron_gpu_online
90/183 Test #23: coreneuron_modtests::fast_imem_py_cpu .............................. Passed 2.30 sec
Start 24: coreneuron_modtests::datareturn_py_cpu
91/183 Test #129: testcorenrn_patstim::coreneuron_cpu_offline ........................ Passed 36.81 sec
Start 134: testcorenrn_vecplay::coreneuron_gpu_online_psolve_alternate
92/183 Test #19: coreneuron_modtests::direct_py_cpu ................................. Passed 30.62 sec
Start 25: coreneuron_modtests::test_units_py_cpu
93/183 Test #17: coreneuron_modtests::version_macros ................................ Passed 31.80 sec
Start 26: coreneuron_modtests::test_netmove_py_cpu
94/183 Test #18: coreneuron_modtests::fornetcon_py_cpu .............................. Passed 30.94 sec
Start 27: coreneuron_modtests::test_pointer_py_cpu
95/183 Test #110: testcorenrn_gf::coreneuron_cpu_online_psolve_alternate ............. Passed 116.72 sec
Start 135: testcorenrn_vecplay::coreneuron_gpu_offline
96/183 Test #175: channel_benchmark_sscx::coreneuron_cpu_filemode .................... Passed 84.81 sec
Start 28: coreneuron_modtests::test_watchrange_py_cpu
97/183 Test #20: coreneuron_modtests::direct_hoc_cpu ................................ Passed 43.81 sec
Start 29: coreneuron_modtests::test_psolve_py_cpu
98/183 Test #21: coreneuron_modtests::spikes_py_cpu ................................. Passed 42.18 sec
Start 30: coreneuron_modtests::test_ba_py_cpu
99/183 Test #22: coreneuron_modtests::spikes_file_mode_py_cpu ....................... Passed 31.20 sec
Start 31: coreneuron_modtests::test_natrans_py_cpu
100/183 Test #24: coreneuron_modtests::datareturn_py_cpu ............................. Passed 37.23 sec
Start 35: coreneuron_modtests::fornetcon_py_gpu
101/183 Test #25: coreneuron_modtests::test_units_py_cpu ............................. Passed 39.90 sec
Start 36: coreneuron_modtests::direct_py_gpu
102/183 Test #28: coreneuron_modtests::test_watchrange_py_cpu ........................ Passed 37.99 sec
Start 37: coreneuron_modtests::direct_hoc_gpu
103/183 Test #26: coreneuron_modtests::test_netmove_py_cpu ........................... Passed 39.88 sec
Start 38: coreneuron_modtests::spikes_py_gpu
104/183 Test #133: testcorenrn_vecplay::coreneuron_gpu_online ......................... Passed 45.25 sec
Start 137: testcorenrn_vecplay::coreneuron_cpu_online
105/183 Test #135: testcorenrn_vecplay::coreneuron_gpu_offline ........................ Passed 39.84 sec
Start 138: testcorenrn_vecplay::coreneuron_cpu_online_psolve_alternate
106/183 Test #128: testcorenrn_patstim::coreneuron_cpu_offline_saverestore ............ Passed 83.45 sec
Start 139: testcorenrn_vecplay::coreneuron_cpu_offline
107/183 Test #134: testcorenrn_vecplay::coreneuron_gpu_online_psolve_alternate ........ Passed 45.48 sec
Start 152: testcorenrn_watch::neuron
108/183 Test #152: testcorenrn_watch::neuron .......................................... Passed 1.56 sec
Start 153: testcorenrn_watch::coreneuron_gpu_online
109/183 Test #30: coreneuron_modtests::test_ba_py_cpu ................................ Passed 30.19 sec
Start 39: coreneuron_modtests::spikes_file_mode_py_gpu
110/183 Test #29: coreneuron_modtests::test_psolve_py_cpu ............................ Passed 30.69 sec
Start 40: coreneuron_modtests::fast_imem_py_gpu
111/183 Test #40: coreneuron_modtests::fast_imem_py_gpu .............................. Passed 0.66 sec
Start 41: coreneuron_modtests::datareturn_py_gpu
112/183 Test #31: coreneuron_modtests::test_natrans_py_cpu ........................... Passed 27.79 sec
Start 42: coreneuron_modtests::test_units_py_gpu
113/183 Test #35: coreneuron_modtests::fornetcon_py_gpu .............................. Passed 31.72 sec
Start 43: coreneuron_modtests::test_netmove_py_gpu
114/183 Test #37: coreneuron_modtests::direct_hoc_gpu ................................ Passed 28.52 sec
Start 44: coreneuron_modtests::test_pointer_py_gpu
115/183 Test #139: testcorenrn_vecplay::coreneuron_cpu_offline ........................ Passed 28.52 sec
Start 154: testcorenrn_watch::coreneuron_gpu_online_psolve_alternate
116/183 Test #36: coreneuron_modtests::direct_py_gpu ................................. Passed 40.91 sec
Start 45: coreneuron_modtests::test_watchrange_py_gpu
117/183 Test #38: coreneuron_modtests::spikes_py_gpu ................................. Passed 40.88 sec
Start 46: coreneuron_modtests::test_psolve_py_gpu
118/183 Test #39: coreneuron_modtests::spikes_file_mode_py_gpu ....................... Passed 35.31 sec
Start 47: coreneuron_modtests::test_ba_py_gpu
119/183 Test #41: coreneuron_modtests::datareturn_py_gpu ............................. Passed 34.91 sec
Start 48: coreneuron_modtests::test_natrans_py_gpu
120/183 Test #137: testcorenrn_vecplay::coreneuron_cpu_online ......................... Passed 40.49 sec
Start 155: testcorenrn_watch::coreneuron_gpu_offline
121/183 Test #138: testcorenrn_vecplay::coreneuron_cpu_online_psolve_alternate ........ Passed 40.92 sec
Start 157: testcorenrn_watch::coreneuron_cpu_online
122/183 Test #153: testcorenrn_watch::coreneuron_gpu_online ........................... Passed 37.70 sec
Start 158: testcorenrn_watch::coreneuron_cpu_online_psolve_alternate
123/183 Test #42: coreneuron_modtests::test_units_py_gpu ............................. Passed 37.25 sec
Start 52: modlunit_unitstest
124/183 Test #52: modlunit_unitstest ................................................. Passed 0.02 sec
Start 53: modlunit_hh
125/183 Test #53: modlunit_hh ........................................................ Passed 0.03 sec
Start 54: modlunit_stim
126/183 Test #54: modlunit_stim ...................................................... Passed 0.03 sec
Start 55: modlunit_pattern
127/183 Test #55: modlunit_pattern ................................................... Passed 0.01 sec
Start 56: external_nrntest
128/183 Test #43: coreneuron_modtests::test_netmove_py_gpu ........................... Passed 36.15 sec
Start 83: testcorenrn_bbcore::compare_results
129/183 Test #83: testcorenrn_bbcore::compare_results ................................ Passed 0.06 sec
Start 93: testcorenrn_conc::compare_results
130/183 Test #93: testcorenrn_conc::compare_results .................................. Passed 0.05 sec
Start 103: testcorenrn_deriv::compare_results
131/183 Test #103: testcorenrn_deriv::compare_results ................................. Passed 0.05 sec
Start 113: testcorenrn_gf::compare_results
132/183 Test #113: testcorenrn_gf::compare_results .................................... Passed 0.06 sec
Start 123: testcorenrn_kin::compare_results
133/183 Test #123: testcorenrn_kin::compare_results ................................... Passed 0.05 sec
Start 131: testcorenrn_patstim::compare_results
134/183 Test #131: testcorenrn_patstim::compare_results ............................... Passed 0.06 sec
Start 141: testcorenrn_vecplay::compare_results
135/183 Test #141: testcorenrn_vecplay::compare_results ............................... Passed 0.06 sec
Start 170: channel_benchmark_hippo::compare_results
136/183 Test #170: channel_benchmark_hippo::compare_results ........................... Passed 0.05 sec
Start 176: channel_benchmark_sscx::compare_results
137/183 Test #176: channel_benchmark_sscx::compare_results ............................ Passed 0.18 sec
138/183 Test #47: coreneuron_modtests::test_ba_py_gpu ................................ Passed 30.61 sec
Start 159: testcorenrn_watch::coreneuron_cpu_offline
139/183 Test #45: coreneuron_modtests::test_watchrange_py_gpu ........................ Passed 30.75 sec
140/183 Test #46: coreneuron_modtests::test_psolve_py_gpu ............................ Passed 30.79 sec
Start 162: testcorenrn_netstimdirect::direct_netstimdirect
141/183 Test #48: coreneuron_modtests::test_natrans_py_gpu ........................... Passed 30.53 sec
142/183 Test #155: testcorenrn_watch::coreneuron_gpu_offline .......................... Passed 35.90 sec
Start 163: testcorenrn_netstimdirect::direct_netstimdirect_psolve_alternate
143/183 Test #157: testcorenrn_watch::coreneuron_cpu_online ........................... Passed 36.26 sec
Start 32: coreneuron_modtests::spikes_mpi_py_cpu
144/183 Test #158: testcorenrn_watch::coreneuron_cpu_online_psolve_alternate .......... Passed 38.59 sec
Start 33: coreneuron_modtests::spikes_mpi_file_mode_py_cpu
145/183 Test #154: testcorenrn_watch::coreneuron_gpu_online_psolve_alternate .......... Passed 57.04 sec
Start 34: coreneuron_modtests::inputpresyn_py_cpu
146/183 Test #159: testcorenrn_watch::coreneuron_cpu_offline .......................... Passed 24.42 sec
Start 49: coreneuron_modtests::spikes_mpi_py_gpu
Start 161: testcorenrn_watch::compare_results
147/183 Test #161: testcorenrn_watch::compare_results ................................. Passed 0.19 sec
148/183 Test #162: testcorenrn_netstimdirect::direct_netstimdirect .................... Passed 26.13 sec
Start 50: coreneuron_modtests::spikes_mpi_file_mode_py_gpu
149/183 Test #32: coreneuron_modtests::spikes_mpi_py_cpu ............................. Passed 24.85 sec
Start 51: coreneuron_modtests::inputpresyn_py_gpu
150/183 Test #163: testcorenrn_netstimdirect::direct_netstimdirect_psolve_alternate ... Passed 26.89 sec
Start 164: testcorenrn_netstimdirect::compare_results
151/183 Test #164: testcorenrn_netstimdirect::compare_results ......................... Passed 0.58 sec
152/183 Test #33: coreneuron_modtests::spikes_mpi_file_mode_py_cpu ................... Passed 25.26 sec
Start 146: testcorenrn_vecevent::coreneuron_gpu_offline::preparation
153/183 Test #34: coreneuron_modtests::inputpresyn_py_cpu ............................ Passed 21.81 sec
154/183 Test #146: testcorenrn_vecevent::coreneuron_gpu_offline::preparation .......... Passed 29.59 sec
Start 150: testcorenrn_vecevent::coreneuron_cpu_offline::preparation
155/183 Test #150: testcorenrn_vecevent::coreneuron_cpu_offline::preparation .......... Passed 3.38 sec
Start 178: olfactory-bulb-3d::neuron::preparation
156/183 Test #178: olfactory-bulb-3d::neuron::preparation ............................. Passed 0.02 sec
Start 180: olfactory-bulb-3d::coreneuron_gpu_online::preparation
157/183 Test #180: olfactory-bulb-3d::coreneuron_gpu_online::preparation .............. Passed 0.01 sec
Start 182: olfactory-bulb-3d::coreneuron_cpu_online::preparation
158/183 Test #182: olfactory-bulb-3d::coreneuron_cpu_online::preparation .............. Passed 0.02 sec
Start 57: reduced_dentate::neuron
159/183 Test #49: coreneuron_modtests::spikes_mpi_py_gpu ............................. Passed 50.81 sec
Start 58: reduced_dentate::coreneuron_cpu
160/183 Test #51: coreneuron_modtests::inputpresyn_py_gpu ............................ Passed 43.92 sec
161/183 Test #50: coreneuron_modtests::spikes_mpi_file_mode_py_gpu ................... Passed 49.27 sec
Start 59: reduced_dentate::coreneuron_gpu
162/183 Test #57: reduced_dentate::neuron ............................................ Passed 12.18 sec
Start 142: testcorenrn_vecevent::neuron
163/183 Test #27: coreneuron_modtests::test_pointer_py_cpu ........................... Passed 198.84 sec
164/183 Test #44: coreneuron_modtests::test_pointer_py_gpu ........................... Passed 155.29 sec
165/183 Test #142: testcorenrn_vecevent::neuron ....................................... Passed 60.20 sec
Start 67: external_ringtest::coreneuron_cpu_mpi_threads
166/183 Test #58: reduced_dentate::coreneuron_cpu .................................... Passed 81.24 sec
Start 143: testcorenrn_vecevent::coreneuron_gpu_online
167/183 Test #59: reduced_dentate::coreneuron_gpu .................................... Passed 81.13 sec
Start 144: testcorenrn_vecevent::coreneuron_gpu_online_psolve_alternate
Start 60: reduced_dentate::compare_results
168/183 Test #60: reduced_dentate::compare_results ................................... Passed 0.17 sec
169/183 Test #67: external_ringtest::coreneuron_cpu_mpi_threads ...................... Passed 19.75 sec
Start 72: external_ringtest::coreneuron_gpu_mpi_threads
170/183 Test #143: testcorenrn_vecevent::coreneuron_gpu_online ........................ Passed 11.12 sec
Start 145: testcorenrn_vecevent::coreneuron_gpu_offline
171/183 Test #72: external_ringtest::coreneuron_gpu_mpi_threads ...................... Passed 25.43 sec
Start 147: testcorenrn_vecevent::coreneuron_cpu_online
Start 73: external_ringtest::compare_results
172/183 Test #73: external_ringtest::compare_results ................................. Passed 0.07 sec
173/183 Test #144: testcorenrn_vecevent::coreneuron_gpu_online_psolve_alternate ....... Passed 34.10 sec
Start 148: testcorenrn_vecevent::coreneuron_cpu_online_psolve_alternate
174/183 Test #145: testcorenrn_vecevent::coreneuron_gpu_offline ....................... Passed 24.02 sec
Start 149: testcorenrn_vecevent::coreneuron_cpu_offline
175/183 Test #147: testcorenrn_vecevent::coreneuron_cpu_online ........................ Passed 9.17 sec
Start 177: olfactory-bulb-3d::neuron
176/183 Test #149: testcorenrn_vecevent::coreneuron_cpu_offline ....................... Passed 11.37 sec
Start 179: olfactory-bulb-3d::coreneuron_gpu_online
177/183 Test #148: testcorenrn_vecevent::coreneuron_cpu_online_psolve_alternate ....... Passed 35.82 sec
Start 181: olfactory-bulb-3d::coreneuron_cpu_online
Start 151: testcorenrn_vecevent::compare_results
178/183 Test #151: testcorenrn_vecevent::compare_results .............................. Passed 0.07 sec
179/183 Test #56: external_nrntest ................................................... Passed 249.65 sec
180/183 Test #179: olfactory-bulb-3d::coreneuron_gpu_online ........................... Passed 119.24 sec
181/183 Test #177: olfactory-bulb-3d::neuron .......................................... Passed 161.87 sec
182/183 Test #181: olfactory-bulb-3d::coreneuron_cpu_online ........................... Passed 130.58 sec
Start 183: olfactory-bulb-3d::compare_results
183/183 Test #183: olfactory-bulb-3d::compare_results ................................. Passed 0.23 sec
100% tests passed, 0 tests failed out of 183
Total Test time (real) = 886.24 sec
$ cp -r Testing/ ${CI_PROJECT_DIR}/
$ module load unstable unit-test-translator
Autoloading python/3.9.7
$ cmake2junit > ${CI_PROJECT_DIR}/ctest.xml
$ exit ${i_am_a_failure}
section_end:1649846821:step_script section_start:1649846821:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=10658 revision=58ba2b95 version=14.2.0
initial_environment.env: found 1 matching files and directories
Testing/: found 7 matching files and directories 
Uploading artifacts as "archive" to coordinator... ok id=211038 responseStatus=201 Created token=s5GYjSa3
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=10713 revision=58ba2b95 version=14.2.0
ctest.xml: found 1 matching files and directories 
Uploading artifacts as "junit" to coordinator... ok id=211038 responseStatus=201 Created token=s5GYjSa3
section_end:1649846823:upload_artifacts_on_success section_start:1649846823:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1649846824:cleanup_file_variables Job succeeded
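
Note: the test jobs in this pipeline share the same step-script pattern visible in the transcripts above and below: CTest is run inside `spack build-env`, its exit status is captured in `i_am_a_failure` instead of aborting the script, and the `Testing/` directory plus a JUnit-converted `ctest.xml` are still produced before the stored status is propagated with `exit`. The following is a minimal sketch of that pattern, assuming a bash shell and that SPACK_ROOT, SPACK_FULL_SPEC, SPACK_BUILD_DIR, SLURM_TASKS_PER_NODE and CI_PROJECT_DIR are supplied by the job environment, as they are in these logs.

    #!/bin/bash
    # Minimal sketch of the CI test step seen in this log; environment variables are assumed to be set by the job.
    set -u

    . "${SPACK_ROOT}/share/spack/setup-env.sh"              # make the spack command available
    export CTEST_PARALLEL_LEVEL="${SLURM_TASKS_PER_NODE}"   # run as many tests in parallel as there are SLURM tasks
    cd "${SPACK_BUILD_DIR}"

    i_am_a_failure=0
    # Run the whole suite; remember a failure but keep going so the result files below are still produced.
    spack build-env "${SPACK_FULL_SPEC}" -- ctest --output-on-failure -T Test || i_am_a_failure=1

    # Preserve the raw CTest results and a JUnit conversion for GitLab's test report UI.
    cp -r Testing/ "${CI_PROJECT_DIR}/"
    module load unstable unit-test-translator
    cmake2junit > "${CI_PROJECT_DIR}/ctest.xml"

    # Only now report success or failure, so artifact generation is never skipped.
    exit "${i_am_a_failure}"
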
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1649845839:resolve_secrets Resolving secrets
section_end:1649845839:resolve_secrets section_start:1649845839:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor991264248, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 48472, build ref 9a466a6e2a1c807e4d6045f3fc0b1fbaf51370b6, job ID 211037
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472, optional exclusive flag , optional cpus per task flag --cpus-per-task=1, optional qos flag
A slurm job will be created with name GL_J211037_PROD_P112_CP5_C5
Job parameters: memory=76G, cpus_per_task=1, duration=1:00:00, constraint=volta ntasks=16 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
Submitted batch job 379529
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=16 --cpus-per-task=1 --mem=76G --job-name=GL_J211037_PROD_P112_CP5_C5 -C volta --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P48472 --ntasks=16 --jobid=379529 --cpus-per-task=1 --mem=76G
section_end:1649845840:prepare_executor section_start:1649845840:prepare_script Preparing environment
Running on ldir01u09.bbp.epfl.ch via bbpv1.epfl.ch...
section_end:1649845848:prepare_script section_start:1649845848:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1649845849:get_sources section_start:1649845849:download_artifacts Downloading artifacts
Downloading artifacts for build:neuron:nmodl:nvhpc:omp (211026)...
Runtime platform  arch=amd64 os=linux pid=65485 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=211026 responseStatus=200 OK token=Pm8KebQh
section_end:1649845851:download_artifacts section_start:1649845851:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ cd ${SPACK_BUILD_DIR}
$ export BOOST_TEST_COLOR_OUTPUT=no
$ i_am_a_failure=0
$ spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
Site: r1i4n0
Build name: Linux-nvc++
Create new tag: 20220413-1031 - Experimental
Test project /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P48472/J211026/spack-build/spack-stage-neuron-develop-7rnzemtajzavcycpixj4vl565kzy3d6v/spack-build-7rnzemt
Start 66: external_ringtest::coreneuron_cpu_mpi_offline::preparation
Start 71: external_ringtest::coreneuron_gpu_mpi_offline::preparation
Start 78: testcorenrn_bbcore::coreneuron_gpu_offline::preparation
Start 82: testcorenrn_bbcore::coreneuron_cpu_offline::preparation
Start 88: testcorenrn_conc::coreneuron_gpu_offline::preparation
Start 92: testcorenrn_conc::coreneuron_cpu_offline::preparation
Start 98: testcorenrn_deriv::coreneuron_gpu_offline::preparation
Start 102: testcorenrn_deriv::coreneuron_cpu_offline::preparation
Start 108: testcorenrn_gf::coreneuron_gpu_offline::preparation
Start 112: testcorenrn_gf::coreneuron_cpu_offline::preparation
Start 118: testcorenrn_kin::coreneuron_gpu_offline::preparation
Start 122: testcorenrn_kin::coreneuron_cpu_offline::preparation
1/183 Test #78: testcorenrn_bbcore::coreneuron_gpu_offline::preparation ............ Passed 2.36 sec
Start 127: testcorenrn_patstim::coreneuron_gpu_offline::preparation
2/183 Test #88: testcorenrn_conc::coreneuron_gpu_offline::preparation .............. Passed 3.64 sec
Start 61: external_ringtest::neuron
3/183 Test #92: testcorenrn_conc::coreneuron_cpu_offline::preparation .............. Passed 3.78 sec
Start 74: testcorenrn_bbcore::neuron
4/183 Test #82: testcorenrn_bbcore::coreneuron_cpu_offline::preparation ............ Passed 3.90 sec
Start 75: testcorenrn_bbcore::coreneuron_gpu_online
5/183 Test #98: testcorenrn_deriv::coreneuron_gpu_offline::preparation ............. Passed 3.89 sec
Start 76: testcorenrn_bbcore::coreneuron_gpu_online_psolve_alternate
6/183 Test #118: testcorenrn_kin::coreneuron_gpu_offline::preparation ............... Passed 4.10 sec
Start 77: testcorenrn_bbcore::coreneuron_gpu_offline
7/183 Test #102: testcorenrn_deriv::coreneuron_cpu_offline::preparation ............. Passed 4.14 sec
Start 79: testcorenrn_bbcore::coreneuron_cpu_online
8/183 Test #122: testcorenrn_kin::coreneuron_cpu_offline::preparation ............... Passed 4.35 sec
Start 80: testcorenrn_bbcore::coreneuron_cpu_online_psolve_alternate
9/183 Test #112: testcorenrn_gf::coreneuron_cpu_offline::preparation ................ Passed 4.52 sec
Start 130: testcorenrn_patstim::coreneuron_cpu_offline::preparation
10/183 Test #108: testcorenrn_gf::coreneuron_gpu_offline::preparation ................ Passed 4.58 sec
Start 136: testcorenrn_vecplay::coreneuron_gpu_offline::preparation
11/183 Test #127: testcorenrn_patstim::coreneuron_gpu_offline::preparation ........... Passed 2.28 sec
Start 81: testcorenrn_bbcore::coreneuron_cpu_offline
12/183 Test #61: external_ringtest::neuron .......................................... Passed 3.43 sec
Start 84: testcorenrn_conc::neuron
13/183 Test #74: testcorenrn_bbcore::neuron ......................................... Passed 4.64 sec
Start 85: testcorenrn_conc::coreneuron_gpu_online
14/183 Test #136: testcorenrn_vecplay::coreneuron_gpu_offline::preparation ........... Passed 4.82 sec
Start 140: testcorenrn_vecplay::coreneuron_cpu_offline::preparation
15/183 Test #84: testcorenrn_conc::neuron ........................................... Passed 2.44 sec
Start 86: testcorenrn_conc::coreneuron_gpu_online_psolve_alternate
16/183 Test #130: testcorenrn_patstim::coreneuron_cpu_offline::preparation ........... Passed 5.84 sec
Start 156: testcorenrn_watch::coreneuron_gpu_offline::preparation
17/183 Test #71: external_ringtest::coreneuron_gpu_mpi_offline::preparation ......... Passed 11.16 sec
Start 160: testcorenrn_watch::coreneuron_cpu_offline::preparation
18/183 Test #66: external_ringtest::coreneuron_cpu_mpi_offline::preparation ......... Passed 11.21 sec
Start 62: external_ringtest::neuron_mpi
19/183 Test #140: testcorenrn_vecplay::coreneuron_cpu_offline::preparation ........... Passed 2.62 sec
Start 63: external_ringtest::coreneuron_cpu_mpi_offline_saverestore
20/183 Test #156: testcorenrn_watch::coreneuron_gpu_offline::preparation ............. Passed 4.75 sec
Start 64: external_ringtest::coreneuron_cpu_mpi
21/183 Test #160: testcorenrn_watch::coreneuron_cpu_offline::preparation ............. Passed 4.50 sec
Start 65: external_ringtest::coreneuron_cpu_mpi_offline
22/183 Test #80: testcorenrn_bbcore::coreneuron_cpu_online_psolve_alternate ......... Passed 13.47 sec
Start 87: testcorenrn_conc::coreneuron_gpu_offline
23/183 Test #81: testcorenrn_bbcore::coreneuron_cpu_offline ......................... Passed 13.41 sec
Start 89: testcorenrn_conc::coreneuron_cpu_online
24/183 Test #79: testcorenrn_bbcore::coreneuron_cpu_online .......................... Passed 14.05 sec
Start 90: testcorenrn_conc::coreneuron_cpu_online_psolve_alternate
25/183 Test #77: testcorenrn_bbcore::coreneuron_gpu_offline ......................... Passed 15.50 sec
Start 91: testcorenrn_conc::coreneuron_cpu_offline
26/183 Test #75: testcorenrn_bbcore::coreneuron_gpu_online .......................... Passed 16.47 sec
Start 94: testcorenrn_deriv::neuron
27/183 Test #85: testcorenrn_conc::coreneuron_gpu_online ............................ Passed 13.32 sec
Start 95: testcorenrn_deriv::coreneuron_gpu_online
28/183 Test #76: testcorenrn_bbcore::coreneuron_gpu_online_psolve_alternate ......... Passed 18.06 sec
Start 96: testcorenrn_deriv::coreneuron_gpu_online_psolve_alternate
29/183 Test #62: external_ringtest::neuron_mpi ...................................... Passed 15.37 sec
Start 68: external_ringtest::coreneuron_gpu_mpi_offline_saverestore
30/183 Test #91: testcorenrn_conc::coreneuron_cpu_offline ........................... Passed 8.22 sec
Start 97: testcorenrn_deriv::coreneuron_gpu_offline
31/183 Test #86: testcorenrn_conc::coreneuron_gpu_online_psolve_alternate ........... Passed 19.63 sec
Start 99: testcorenrn_deriv::coreneuron_cpu_online
32/183 Test #94: testcorenrn_deriv::neuron .......................................... Passed 11.04 sec
Start 100: testcorenrn_deriv::coreneuron_cpu_online_psolve_alternate
33/183 Test #65: external_ringtest::coreneuron_cpu_mpi_offline ...................... Passed 16.06 sec
Start 69: external_ringtest::coreneuron_gpu_mpi
34/183 Test #87: testcorenrn_conc::coreneuron_gpu_offline ........................... Passed 14.03 sec
Start 101: testcorenrn_deriv::coreneuron_cpu_offline
35/183 Test #89: testcorenrn_conc::coreneuron_cpu_online ............................ Passed 51.79 sec
Start 114: testcorenrn_kin::neuron
36/183 Test #101: testcorenrn_deriv::coreneuron_cpu_offline .......................... Passed 39.81 sec
Start 115: testcorenrn_kin::coreneuron_gpu_online
37/183 Test #114: testcorenrn_kin::neuron ............................................ Passed 2.37 sec
Start 116: testcorenrn_kin::coreneuron_gpu_online_psolve_alternate
38/183 Test #90: testcorenrn_conc::coreneuron_cpu_online_psolve_alternate ........... Passed 57.33 sec
Start 117: testcorenrn_kin::coreneuron_gpu_offline
39/183 Test #99: testcorenrn_deriv::coreneuron_cpu_online ........................... Passed 46.83 sec
Start 119: testcorenrn_kin::coreneuron_cpu_online
40/183 Test #100: testcorenrn_deriv::coreneuron_cpu_online_psolve_alternate .......... Passed 46.15 sec
Start 120: testcorenrn_kin::coreneuron_cpu_online_psolve_alternate
41/183 Test #97: testcorenrn_deriv::coreneuron_gpu_offline .......................... Passed 49.94 sec
Start 121: testcorenrn_kin::coreneuron_cpu_offline
42/183 Test #95: testcorenrn_deriv::coreneuron_gpu_online ........................... Passed 56.00 sec
Start 125: testcorenrn_patstim::coreneuron_gpu_offline_saverestore
43/183 Test #69: external_ringtest::coreneuron_gpu_mpi .............................. Passed 46.13 sec
Start 70: external_ringtest::coreneuron_gpu_mpi_offline
44/183 Test #64: external_ringtest::coreneuron_cpu_mpi .............................. Passed 62.69 sec
Start 104: testcorenrn_gf::neuron
45/183 Test #96: testcorenrn_deriv::coreneuron_gpu_online_psolve_alternate .......... Passed 59.43 sec
Start 126: testcorenrn_patstim::coreneuron_gpu_offline
46/183 Test #104: testcorenrn_gf::neuron ............................................. Passed 3.52 sec
Start 105: testcorenrn_gf::coreneuron_gpu_online
47/183 Test #115: testcorenrn_kin::coreneuron_gpu_online ............................. Passed 35.54 sec
Start 165: channel_benchmark_hippo::neuron
48/183 Test #119: testcorenrn_kin::coreneuron_cpu_online ............................. Passed 42.40 sec
Start 166: channel_benchmark_hippo::coreneuron_gpu_online
49/183 Test #116: testcorenrn_kin::coreneuron_gpu_online_psolve_alternate ............ Passed 49.77 sec
Start 167: channel_benchmark_hippo::coreneuron_gpu_filemode
50/183 Test #121: testcorenrn_kin::coreneuron_cpu_offline ............................ Passed 45.11 sec
Start 168: channel_benchmark_hippo::coreneuron_cpu_online
51/183 Test #117: testcorenrn_kin::coreneuron_gpu_offline ............................ Passed 48.05 sec
Start 169: channel_benchmark_hippo::coreneuron_cpu_filemode
52/183 Test #120: testcorenrn_kin::coreneuron_cpu_online_psolve_alternate ............ Passed 46.29 sec
Start 171: channel_benchmark_sscx::neuron
53/183 Test #126: testcorenrn_patstim::coreneuron_gpu_offline ........................ Passed 58.79 sec
Start 172: channel_benchmark_sscx::coreneuron_gpu_online
54/183 Test #70: external_ringtest::coreneuron_gpu_mpi_offline ...................... Passed 70.73 sec
Start 106: testcorenrn_gf::coreneuron_gpu_online_psolve_alternate
55/183 Test #105: testcorenrn_gf::coreneuron_gpu_online .............................. Passed 82.79 sec
Start 107: testcorenrn_gf::coreneuron_gpu_offline
56/183 Test #125: testcorenrn_patstim::coreneuron_gpu_offline_saverestore ............ Passed 119.33 sec
Start 173: channel_benchmark_sscx::coreneuron_gpu_filemode
57/183 Test #63: external_ringtest::coreneuron_cpu_mpi_offline_saverestore .......... Passed 237.95 sec
Start 109: testcorenrn_gf::coreneuron_cpu_online
58/183 Test #165: channel_benchmark_hippo::neuron .................................... Passed 162.15 sec
Start 174: channel_benchmark_sscx::coreneuron_cpu_online
59/183 Test #68: external_ringtest::coreneuron_gpu_mpi_offline_saverestore .......... Passed 250.71 sec
Start 110: testcorenrn_gf::coreneuron_cpu_online_psolve_alternate
60/183 Test #171: channel_benchmark_sscx::neuron ..................................... Passed 158.82 sec
Start 175: channel_benchmark_sscx::coreneuron_cpu_filemode
61/183 Test #107: testcorenrn_gf::coreneuron_gpu_offline ............................. Passed 130.90 sec
Start 111: testcorenrn_gf::coreneuron_cpu_offline
62/183 Test #106: testcorenrn_gf::coreneuron_gpu_online_psolve_alternate ............. Passed 146.68 sec
Start 124: testcorenrn_patstim::neuron
63/183 Test #124: testcorenrn_patstim::neuron ........................................ Passed 2.49 sec
Start 128: testcorenrn_patstim::coreneuron_cpu_offline_saverestore
64/183 Test #166: channel_benchmark_hippo::coreneuron_gpu_online ..................... Passed 229.04 sec
Start 1: testneuron
65/183 Test #1: testneuron ......................................................... Passed 0.06 sec
Start 2: ringtest
66/183 Test #2: ringtest ........................................................... Passed 0.24 sec
Start 3: connect_dend
67/183 Test #3: connect_dend ....................................................... Passed 0.14 sec
Start 4: mpi_init::nrniv_mpiopt
68/183 Test #4: mpi_init::nrniv_mpiopt ............................................. Passed 0.07 sec
Start 5: mpi_init::nrniv_nrnmpi_init
69/183 Test #5: mpi_init::nrniv_nrnmpi_init ........................................ Passed 0.07 sec
Start 6: mpi_init::python_nrnmpi_init
70/183 Test #6: mpi_init::python_nrnmpi_init ....................................... Passed 0.31 sec
Start 7: mpi_init::python_mpienv
71/183 Test #7: mpi_init::python_mpienv ............................................ Passed 0.39 sec
Start 8: mpi_init::nrniv_mpiexec_mpiopt
72/183 Test #8: mpi_init::nrniv_mpiexec_mpiopt ..................................... Passed 1.31 sec
Start 9: mpi_init::nrniv_mpiexec_nrnmpi_init
73/183 Test #9: mpi_init::nrniv_mpiexec_nrnmpi_init ................................ Passed 1.35 sec
Start 10: mpi_init::python_mpiexec_nrnmpi_init
74/183 Test #109: testcorenrn_gf::coreneuron_cpu_online .............................. Passed 102.14 sec
Start 129: testcorenrn_patstim::coreneuron_cpu_offline
75/183 Test #10: mpi_init::python_mpiexec_nrnmpi_init ............................... Passed 1.56 sec
Start 11: mpi_init::python_mpiexec_mpienv
76/183 Test #11: mpi_init::python_mpiexec_mpienv .................................... Passed 1.77 sec
Start 12: pynrn::basic_tests
77/183 Test #168: channel_benchmark_hippo::coreneuron_cpu_online ..................... Passed 233.43 sec
Start 13: parallel_tests
78/183 Test #169: channel_benchmark_hippo::coreneuron_cpu_filemode ................... Passed 232.98 sec
Start 14: parallel_partrans
79/183 Test #172: channel_benchmark_sscx::coreneuron_gpu_online ...................... Passed 221.72 sec
Start 15: parallel_netpar
80/183 Test #12: pynrn::basic_tests ................................................. Passed 10.01 sec
Start 16: parallel_bas
81/183 Test #14: parallel_partrans .................................................. Passed 8.18 sec
Start 17: coreneuron_modtests::version_macros
82/183 Test #13: parallel_tests ..................................................... Passed 9.69 sec
Start 18: coreneuron_modtests::fornetcon_py_cpu
83/183 Test #167: channel_benchmark_hippo::coreneuron_gpu_filemode ................... Passed 256.66 sec
Start 19: coreneuron_modtests::direct_py_cpu
84/183 Test #111: testcorenrn_gf::coreneuron_cpu_offline ............................. Passed 95.55 sec
Start 132: testcorenrn_vecplay::neuron
85/183 Test #175: channel_benchmark_sscx::coreneuron_cpu_filemode .................... Passed 108.29 sec
Start 20: coreneuron_modtests::direct_hoc_cpu
86/183 Test #174: channel_benchmark_sscx::coreneuron_cpu_online ...................... Passed 121.39 sec
Start 21: coreneuron_modtests::spikes_py_cpu
87/183 Test #173: channel_benchmark_sscx::coreneuron_gpu_filemode .................... Passed 194.02 sec
Start 22: coreneuron_modtests::spikes_file_mode_py_cpu
88/183 Test #132: testcorenrn_vecplay::neuron ........................................ Passed 2.75 sec
Start 133: testcorenrn_vecplay::coreneuron_gpu_online
89/183 Test #16: parallel_bas ....................................................... Passed 35.11 sec
Start 23: coreneuron_modtests::fast_imem_py_cpu
90/183 Test #110: testcorenrn_gf::coreneuron_cpu_online_psolve_alternate ............. Passed 122.19 sec
Start 134: testcorenrn_vecplay::coreneuron_gpu_online_psolve_alternate
91/183 Test #23: coreneuron_modtests::fast_imem_py_cpu .............................. Passed 2.29 sec
Start 24: coreneuron_modtests::datareturn_py_cpu
92/183 Test #129: testcorenrn_patstim::coreneuron_cpu_offline ........................ Passed 78.56 sec
Start 135: testcorenrn_vecplay::coreneuron_gpu_offline
93/183 Test #15: parallel_netpar .................................................... Passed 74.86 sec
Start 25: coreneuron_modtests::test_units_py_cpu
94/183 Test #18: coreneuron_modtests::fornetcon_py_cpu .............................. Passed 72.73 sec
Start 26: coreneuron_modtests::test_netmove_py_cpu
95/183 Test #17: coreneuron_modtests::version_macros ................................ Passed 74.04 sec
Start 27: coreneuron_modtests::test_pointer_py_cpu
96/183 Test #20: coreneuron_modtests::direct_hoc_cpu ................................ Passed 55.20 sec
Start 28: coreneuron_modtests::test_watchrange_py_cpu
97/183 Test #19: coreneuron_modtests::direct_py_cpu ................................. Passed 67.21 sec
Start 29: coreneuron_modtests::test_psolve_py_cpu
98/183 Test #21: coreneuron_modtests::spikes_py_cpu ................................. Passed 55.24 sec
Start 30: coreneuron_modtests::test_ba_py_cpu
99/183 Test #22: coreneuron_modtests::spikes_file_mode_py_cpu ....................... Passed 55.21 sec
Start 31: coreneuron_modtests::test_natrans_py_cpu
100/183 Test #128: testcorenrn_patstim::coreneuron_cpu_offline_saverestore ............ Passed 149.97 sec
Start 137: testcorenrn_vecplay::coreneuron_cpu_online
101/183 Test #24: coreneuron_modtests::datareturn_py_cpu ............................. Passed 49.05 sec
Start 35: coreneuron_modtests::fornetcon_py_gpu
102/183 Test #133: testcorenrn_vecplay::coreneuron_gpu_online ......................... Passed 59.32 sec
Start 138: testcorenrn_vecplay::coreneuron_cpu_online_psolve_alternate
103/183 Test #26: coreneuron_modtests::test_netmove_py_cpu ........................... Passed 30.90 sec
Start 36: coreneuron_modtests::direct_py_gpu
104/183 Test #135: testcorenrn_vecplay::coreneuron_gpu_offline ........................ Passed 40.98 sec
Start 139: testcorenrn_vecplay::coreneuron_cpu_offline
105/183 Test #25: coreneuron_modtests::test_units_py_cpu ............................. Passed 37.23 sec
Start 37: coreneuron_modtests::direct_hoc_gpu
106/183 Test #35: coreneuron_modtests::fornetcon_py_gpu .............................. Passed 37.93 sec
Start 38: coreneuron_modtests::spikes_py_gpu
107/183 Test #28: coreneuron_modtests::test_watchrange_py_cpu ........................ Passed 43.05 sec
Start 39: coreneuron_modtests::spikes_file_mode_py_gpu
108/183 Test #29: coreneuron_modtests::test_psolve_py_cpu ............................ Passed 43.15 sec
Start 40: coreneuron_modtests::fast_imem_py_gpu
109/183 Test #30: coreneuron_modtests::test_ba_py_cpu ................................ Passed 43.23 sec
Start 41: coreneuron_modtests::datareturn_py_gpu
110/183 Test #31: coreneuron_modtests::test_natrans_py_cpu ........................... Passed 43.33 sec
Start 42: coreneuron_modtests::test_units_py_gpu
111/183 Test #134: testcorenrn_vecplay::coreneuron_gpu_online_psolve_alternate ........ Passed 90.10 sec
Start 152: testcorenrn_watch::neuron
112/183 Test #40: coreneuron_modtests::fast_imem_py_gpu .............................. Passed 5.06 sec
Start 43: coreneuron_modtests::test_netmove_py_gpu
113/183 Test #152: testcorenrn_watch::neuron .......................................... Passed 5.37 sec
Start 153: testcorenrn_watch::coreneuron_gpu_online
114/183 Test #137: testcorenrn_vecplay::coreneuron_cpu_online ......................... Passed 48.02 sec
Start 154: testcorenrn_watch::coreneuron_gpu_online_psolve_alternate
115/183 Test #36: coreneuron_modtests::direct_py_gpu ................................. Passed 33.14 sec
Start 44: coreneuron_modtests::test_pointer_py_gpu
116/183 Test #138: testcorenrn_vecplay::coreneuron_cpu_online_psolve_alternate ........ Passed 51.21 sec
Start 155: testcorenrn_watch::coreneuron_gpu_offline
117/183 Test #139: testcorenrn_vecplay::coreneuron_cpu_offline ........................ Passed 37.51 sec
Start 157: testcorenrn_watch::coreneuron_cpu_online
118/183 Test #37: coreneuron_modtests::direct_hoc_gpu ................................ Passed 39.52 sec
Start 45: coreneuron_modtests::test_watchrange_py_gpu
119/183 Test #42: coreneuron_modtests::test_units_py_gpu ............................. Passed 36.34 sec
Start 46: coreneuron_modtests::test_psolve_py_gpu
120/183 Test #38: coreneuron_modtests::spikes_py_gpu ................................. Passed 37.22 sec
Start 47: coreneuron_modtests::test_ba_py_gpu
121/183 Test #39: coreneuron_modtests::spikes_file_mode_py_gpu ....................... Passed 37.15 sec
Start 48: coreneuron_modtests::test_natrans_py_gpu
122/183 Test #41: coreneuron_modtests::datareturn_py_gpu ............................. Passed 37.04 sec
Start 52: modlunit_unitstest
123/183 Test #52: modlunit_unitstest ................................................. Passed 0.02 sec
Start 53: modlunit_hh
124/183 Test #53: modlunit_hh ........................................................ Passed 0.03 sec
Start 54: modlunit_stim
125/183 Test #54: modlunit_stim ...................................................... Passed 0.01 sec
Start 55: modlunit_pattern
126/183 Test #55: modlunit_pattern ................................................... Passed 0.01 sec
Start 56: external_nrntest
127/183 Test #43: coreneuron_modtests::test_netmove_py_gpu ........................... Passed 36.01 sec
Start 83: testcorenrn_bbcore::compare_results
128/183 Test #83: testcorenrn_bbcore::compare_results ................................ Passed 0.06 sec
Start 93: testcorenrn_conc::compare_results
129/183 Test #93: testcorenrn_conc::compare_results .................................. Passed 0.05 sec
Start 103: testcorenrn_deriv::compare_results
130/183 Test #103: testcorenrn_deriv::compare_results ................................. Passed 0.06 sec
Start 113: testcorenrn_gf::compare_results
131/183 Test #113: testcorenrn_gf::compare_results .................................... Passed 0.06 sec
Start 123: testcorenrn_kin::compare_results
132/183 Test #123: testcorenrn_kin::compare_results ................................... Passed 0.06 sec
Start 131: testcorenrn_patstim::compare_results
133/183 Test #131: testcorenrn_patstim::compare_results ............................... Passed 0.06 sec
Start 141: testcorenrn_vecplay::compare_results
134/183 Test #141: testcorenrn_vecplay::compare_results ............................... Passed 0.06 sec
Start 170: channel_benchmark_hippo::compare_results
135/183 Test #170: channel_benchmark_hippo::compare_results ........................... Passed 0.06 sec
Start 176: channel_benchmark_sscx::compare_results
136/183 Test #176: channel_benchmark_sscx::compare_results ............................ Passed 0.06 sec
137/183 Test #153: testcorenrn_watch::coreneuron_gpu_online ........................... Passed 38.28 sec
Start 158: testcorenrn_watch::coreneuron_cpu_online_psolve_alternate
138/183 Test #45: coreneuron_modtests::test_watchrange_py_gpu ........................ Passed 28.78 sec
Start 159: testcorenrn_watch::coreneuron_cpu_offline
139/183 Test #155: testcorenrn_watch::coreneuron_gpu_offline .......................... Passed 38.81 sec
Start 162: testcorenrn_netstimdirect::direct_netstimdirect
140/183 Test #157: testcorenrn_watch::coreneuron_cpu_online ........................... Passed 33.72 sec
Start 163: testcorenrn_netstimdirect::direct_netstimdirect_psolve_alternate
141/183 Test #154: testcorenrn_watch::coreneuron_gpu_online_psolve_alternate .......... Passed 47.73 sec
Start 32: coreneuron_modtests::spikes_mpi_py_cpu
142/183 Test #46: coreneuron_modtests::test_psolve_py_gpu ............................ Passed 37.56 sec
143/183 Test #47: coreneuron_modtests::test_ba_py_gpu ................................ Passed 37.38 sec
Start 33: coreneuron_modtests::spikes_mpi_file_mode_py_cpu
144/183 Test #48: coreneuron_modtests::test_natrans_py_gpu ........................... Passed 37.42 sec
145/183 Test #158: testcorenrn_watch::coreneuron_cpu_online_psolve_alternate .......... Passed 35.15 sec
Start 34: coreneuron_modtests::inputpresyn_py_cpu
146/183 Test #159: testcorenrn_watch::coreneuron_cpu_offline .......................... Passed 40.79 sec
Start 49: coreneuron_modtests::spikes_mpi_py_gpu
Start 161: testcorenrn_watch::compare_results
147/183 Test #161: testcorenrn_watch::compare_results ................................. Passed 0.22 sec
148/183 Test #162: testcorenrn_netstimdirect::direct_netstimdirect .................... Passed 40.92 sec
Start 50: coreneuron_modtests::spikes_mpi_file_mode_py_gpu
149/183 Test #163: testcorenrn_netstimdirect::direct_netstimdirect_psolve_alternate ... Passed 43.92 sec
Start 51: coreneuron_modtests::inputpresyn_py_gpu
Start 164: testcorenrn_netstimdirect::compare_results
150/183 Test #32: coreneuron_modtests::spikes_mpi_py_cpu ............................. Passed 43.68 sec
151/183 Test #164: testcorenrn_netstimdirect::compare_results ......................... Passed 1.16 sec
152/183 Test #33: coreneuron_modtests::spikes_mpi_file_mode_py_cpu ................... Passed 32.89 sec
Start 146: testcorenrn_vecevent::coreneuron_gpu_offline::preparation
153/183 Test #146: testcorenrn_vecevent::coreneuron_gpu_offline::preparation .......... Passed 6.30 sec
Start 150: testcorenrn_vecevent::coreneuron_cpu_offline::preparation
154/183 Test #34: coreneuron_modtests::inputpresyn_py_cpu ............................ Passed 35.30 sec
155/183 Test #150: testcorenrn_vecevent::coreneuron_cpu_offline::preparation .......... Passed 2.44 sec
Start 178: olfactory-bulb-3d::neuron::preparation
156/183 Test #178: olfactory-bulb-3d::neuron::preparation ............................. Passed 0.02 sec
Start 180: olfactory-bulb-3d::coreneuron_gpu_online::preparation
157/183 Test #180: olfactory-bulb-3d::coreneuron_gpu_online::preparation .............. Passed 0.01 sec
Start 182: olfactory-bulb-3d::coreneuron_cpu_online::preparation
158/183 Test #182: olfactory-bulb-3d::coreneuron_cpu_online::preparation .............. Passed 1.21 sec
Start 57: reduced_dentate::neuron
159/183 Test #49: coreneuron_modtests::spikes_mpi_py_gpu ............................. Passed 28.31 sec
Start 58: reduced_dentate::coreneuron_cpu
160/183 Test #51: coreneuron_modtests::inputpresyn_py_gpu ............................ Passed 26.51 sec
161/183 Test #50: coreneuron_modtests::spikes_mpi_file_mode_py_gpu ................... Passed 30.21 sec
Start 59: reduced_dentate::coreneuron_gpu
162/183 Test #57: reduced_dentate::neuron ............................................ Passed 52.83 sec
Start 142: testcorenrn_vecevent::neuron
163/183 Test #56: external_nrntest ................................................... Passed 132.99 sec
164/183 Test #27: coreneuron_modtests::test_pointer_py_cpu ........................... Passed 222.40 sec
165/183 Test #142: testcorenrn_vecevent::neuron ....................................... Passed 1.79 sec
Start 67: external_ringtest::coreneuron_cpu_mpi_threads
166/183 Test #58: reduced_dentate::coreneuron_cpu .................................... Passed 55.51 sec
Start 143: testcorenrn_vecevent::coreneuron_gpu_online
167/183 Test #59: reduced_dentate::coreneuron_gpu .................................... Passed 59.79 sec
Start 144: testcorenrn_vecevent::coreneuron_gpu_online_psolve_alternate
Start 60: reduced_dentate::compare_results
168/183 Test #60: reduced_dentate::compare_results ................................... Passed 0.06 sec
169/183 Test #44: coreneuron_modtests::test_pointer_py_gpu ........................... Passed 184.06 sec
170/183 Test #67: external_ringtest::coreneuron_cpu_mpi_threads ...................... Passed 26.63 sec
Start 72: external_ringtest::coreneuron_gpu_mpi_threads
171/183 Test #144: testcorenrn_vecevent::coreneuron_gpu_online_psolve_alternate ....... Passed 25.23 sec
Start 145: testcorenrn_vecevent::coreneuron_gpu_offline
172/183 Test #143: testcorenrn_vecevent::coreneuron_gpu_online ........................ Passed 51.67 sec
Start 147: testcorenrn_vecevent::coreneuron_cpu_online
173/183 Test #145: testcorenrn_vecevent::coreneuron_gpu_offline ....................... Passed 21.37 sec
Start 148: testcorenrn_vecevent::coreneuron_cpu_online_psolve_alternate
174/183 Test #147: testcorenrn_vecevent::coreneuron_cpu_online ........................ Passed 6.56 sec
Start 149: testcorenrn_vecevent::coreneuron_cpu_offline
175/183 Test #148: testcorenrn_vecevent::coreneuron_cpu_online_psolve_alternate ....... Passed 17.06 sec
Start 177: olfactory-bulb-3d::neuron
176/183 Test #149: testcorenrn_vecevent::coreneuron_cpu_offline ....................... Passed 24.00 sec
Start 179: olfactory-bulb-3d::coreneuron_gpu_online
Start 151: testcorenrn_vecevent::compare_results
177/183 Test #151: testcorenrn_vecevent::compare_results .............................. Passed 0.16 sec
178/183 Test #72: external_ringtest::coreneuron_gpu_mpi_threads ...................... Passed 87.88 sec
Start 181: olfactory-bulb-3d::coreneuron_cpu_online
Start 73: external_ringtest::compare_results
179/183 Test #73: external_ringtest::compare_results ................................. Passed 0.35 sec
180/183 Test #177: olfactory-bulb-3d::neuron .......................................... Passed 191.70 sec
181/183 Test #179: olfactory-bulb-3d::coreneuron_gpu_online ........................... Passed 188.99 sec
182/183 Test #181: olfactory-bulb-3d::coreneuron_cpu_online ........................... Passed 169.92 sec
Start 183: olfactory-bulb-3d::compare_results
183/183 Test #183: olfactory-bulb-3d::compare_results ................................. Passed 0.23 sec
100% tests passed, 0 tests failed out of 183
Total Test time (real) = 946.26 sec
$ cp -r Testing/ ${CI_PROJECT_DIR}/
$ module load unstable unit-test-translator
Autoloading python/3.9.7
$ cmake2junit > ${CI_PROJECT_DIR}/ctest.xml
$ exit ${i_am_a_failure}
section_end:1649846824:step_script section_start:1649846824:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=10964 revision=58ba2b95 version=14.2.0
initial_environment.env: found 1 matching files and directories
Testing/: found 7 matching files and directories 
Uploading artifacts as "archive" to coordinator... ok id=211037 responseStatus=201 Created token=6tft7wEv
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=11018 revision=58ba2b95 version=14.2.0
ctest.xml: found 1 matching files and directories 
Uploading artifacts as "junit" to coordinator... ok id=211037 responseStatus=201 Created token=6tft7wEv
section_end:1649846825:upload_artifacts_on_success section_start:1649846825:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1649846826:cleanup_file_variables Job succeeded
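
Note: the runner output above also shows how each job obtains its compute allocation: a batch job is submitted with `--wrap="sleep infinity"` purely to hold the resources, and the actual work is then launched into that allocation with `srun --jobid=...`. Below is a sketch of that allocate-then-attach pattern; the sbatch/srun flags mirror the lines in this log, while `--parsable`, the build directory path and the `hostname` payload are placeholders added to make the example self-contained.

    #!/bin/bash
    # Sketch of the allocate-then-attach pattern used by the BB5 custom executor (placeholders noted above).
    set -eu

    BUILD_DIR=/path/to/build/dir   # placeholder for the runner's build directory

    # Hold an allocation that does nothing but sleep, so it stays alive for the whole CI job.
    JOBID=$(sbatch --parsable -p prod -A proj9998 --ntasks=16 --cpus-per-task=1 --mem=76G \
            -C volta --no-requeue -D "${BUILD_DIR}" --time=1:00:00 --wrap="sleep infinity")

    # Attach the real workload to that allocation; each CI script step is dispatched this way.
    srun --mpi=none --chdir="${BUILD_DIR}" --ntasks=16 --jobid="${JOBID}" \
         --cpus-per-task=1 --mem=76G hostname

    # Release the allocation once the job script is finished.
    scancel "${JOBID}"
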