Failed

Console Output

Pull request #743 opened
15:38:53 Connecting to https://api.github.com using 476720/******
Obtained .jenkins from 36c1cd7e140129baebe5740431d07527be2799a4
[Pipeline] Start of Pipeline
[Pipeline] withEnv
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 3 hr 0 min
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Build)
[Pipeline] parallel
[Pipeline] { (Branch: CUDA-11-NVCC-DEBUG)
[Pipeline] { (Branch: ROCM-5.2-HIPCC-DEBUG)
[Pipeline] { (Branch: SYCL)
[Pipeline] stage
[Pipeline] { (CUDA-11-NVCC-DEBUG)
[Pipeline] stage
[Pipeline] { (ROCM-5.2-HIPCC-DEBUG)
[Pipeline] stage
[Pipeline] { (SYCL)
[Pipeline] node
[Pipeline] node
[Pipeline] node
Running on fetnat06 in /var/jenkins/workspace/Cabana_PR-743
[Pipeline] {
Running on fetnat04 in /var/jenkins/workspace/Cabana_PR-743
[Pipeline] checkout
The recommended git tool is: NONE
using credential Jenkins ORNL
[Pipeline] {
Cloning the remote Git repository
Cloning with configured refspecs honoured and without tags
[Pipeline] checkout
The recommended git tool is: NONE
using credential Jenkins ORNL
Cloning the remote Git repository
Cloning with configured refspecs honoured and without tags
Cloning repository https://github.com/ECP-copa/Cabana.git
 > git init /var/jenkins/workspace/Cabana_PR-743 # timeout=10
Fetching upstream changes from https://github.com/ECP-copa/Cabana.git
 > git --version # timeout=10
 > git --version # 'git version 2.17.1'
using GIT_ASKPASS to set credentials 
 > git fetch --no-tags --progress -- https://github.com/ECP-copa/Cabana.git +refs/pull/743/head:refs/remotes/origin/PR-743 # timeout=10
Cloning repository https://github.com/ECP-copa/Cabana.git
 > git init /var/jenkins/workspace/Cabana_PR-743 # timeout=10
Fetching upstream changes from https://github.com/ECP-copa/Cabana.git
 > git --version # timeout=10
 > git --version # 'git version 2.17.1'
using GIT_ASKPASS to set credentials 
 > git fetch --no-tags --progress -- https://github.com/ECP-copa/Cabana.git +refs/pull/743/head:refs/remotes/origin/PR-743 # timeout=10
Fetching without tags
Fetching without tags
Checking out Revision 36c1cd7e140129baebe5740431d07527be2799a4 (PR-743)
Commit message: "fixup: allow intel to fail due to recurring CI issues"
First time build. Skipping changelog.
Checking out Revision 36c1cd7e140129baebe5740431d07527be2799a4 (PR-743)
 > git config remote.origin.url https://github.com/ECP-copa/Cabana.git # timeout=10
 > git config --add remote.origin.fetch +refs/pull/743/head:refs/remotes/origin/PR-743 # timeout=10
 > git config remote.origin.url https://github.com/ECP-copa/Cabana.git # timeout=10
Fetching upstream changes from https://github.com/ECP-copa/Cabana.git
using GIT_ASKPASS to set credentials 
 > git fetch --no-tags --progress -- https://github.com/ECP-copa/Cabana.git +refs/pull/743/head:refs/remotes/origin/PR-743 # timeout=10
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 36c1cd7e140129baebe5740431d07527be2799a4 # timeout=10
 > git config remote.origin.url https://github.com/ECP-copa/Cabana.git # timeout=10
 > git config --add remote.origin.fetch +refs/pull/743/head:refs/remotes/origin/PR-743 # timeout=10
 > git config remote.origin.url https://github.com/ECP-copa/Cabana.git # timeout=10
Fetching upstream changes from https://github.com/ECP-copa/Cabana.git
using GIT_ASKPASS to set credentials 
 > git fetch --no-tags --progress -- https://github.com/ECP-copa/Cabana.git +refs/pull/743/head:refs/remotes/origin/PR-743 # timeout=10
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 36c1cd7e140129baebe5740431d07527be2799a4 # timeout=10
Commit message: "fixup: allow intel to fail due to recurring CI issues"
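The checkout above fetches GitHub's pull-request ref rather than a branch. A minimal sketch of reproducing the same checkout locally, using only the repository URL, refspec, and SHA shown in the log (the directory name is arbitrary):

    git init Cabana_PR-743 && cd Cabana_PR-743
    git fetch --no-tags https://github.com/ECP-copa/Cabana.git \
        +refs/pull/743/head:refs/remotes/origin/PR-743
    git checkout -f 36c1cd7e140129baebe5740431d07527be2799a4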
[Pipeline] withEnv
[Pipeline] {
[Pipeline] isUnix
[Pipeline] readFile
[Pipeline] sh
+ docker build -t 578e5e7aaf741e8041f57a4fbc9d1613efeacbf3 --build-arg BASE=nvidia/cuda:11.0.3-devel-ubuntu20.04 -f docker/Dockerfile docker
DEPRECATED: The legacy builder is deprecated and will be removed in a future release.
            Install the buildx component to build images with BuildKit:
            https://docs.docker.com/go/buildx/
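The warning above only notes that the legacy builder is deprecated; the build itself still succeeds. A hedged sketch of what the same invocation could look like once the buildx plugin is installed on the agent (flags mirror the legacy command; --load is assumed so the image still lands in the local daemon):

    docker buildx build --load \
        -t 578e5e7aaf741e8041f57a4fbc9d1613efeacbf3 \
        --build-arg BASE=nvidia/cuda:11.0.3-devel-ubuntu20.04 \
        -f docker/Dockerfile docker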

Sending build context to Docker daemon  19.46kB

Step 1/25 : ARG BASE=nvidia/cuda:11.0.3-devel-ubuntu20.04
Step 2/25 : FROM $BASE
[Pipeline] withEnv
[Pipeline] {
[Pipeline] isUnix
[Pipeline] readFile
[Pipeline] sh
+ docker build -t f1543b7005e5744b73e10444c0db9dc00eeab29c -f docker/Dockerfile.sycl docker
DEPRECATED: The legacy builder is deprecated and will be removed in a future release.
            Install the buildx component to build images with BuildKit:
            https://docs.docker.com/go/buildx/

Sending build context to Docker daemon  19.46kB

Step 1/18 : ARG BASE=nvidia/cuda:11.0.3-devel-ubuntu20.04
Step 2/18 : FROM $BASE
11.0.3-devel-ubuntu20.04: Pulling from nvidia/cuda
96d54c3075c9: Already exists
59f6381879f6: Pulling fs layer
655ed0df26cf: Pulling fs layer
848b95ad96b5: Pulling fs layer
e43c2058e496: Pulling fs layer
aa99269ca9c6: Pulling fs layer
1731132f4b49: Pulling fs layer
51c57cc69f67: Pulling fs layer
6aa50616693c: Pulling fs layer
b4c054bf4c8a: Pulling fs layer
5ae971429117: Pulling fs layer
1731132f4b49: Waiting
e43c2058e496: Waiting
51c57cc69f67: Waiting
6aa50616693c: Waiting
b4c054bf4c8a: Waiting
5ae971429117: Waiting
aa99269ca9c6: Waiting
11.0.3-devel-ubuntu20.04: Pulling from nvidia/cuda
96d54c3075c9: Already exists
848b95ad96b5: Verifying Checksum
848b95ad96b5: Download complete
655ed0df26cf: Verifying Checksum
655ed0df26cf: Download complete
e43c2058e496: Verifying Checksum
e43c2058e496: Download complete
59f6381879f6: Verifying Checksum
59f6381879f6: Download complete
59f6381879f6: Pulling fs layer
655ed0df26cf: Pulling fs layer
848b95ad96b5: Pulling fs layer
e43c2058e496: Pulling fs layer
aa99269ca9c6: Pulling fs layer
1731132f4b49: Pulling fs layer
51c57cc69f67: Pulling fs layer
6aa50616693c: Pulling fs layer
b4c054bf4c8a: Pulling fs layer
5ae971429117: Pulling fs layer
e43c2058e496: Waiting
aa99269ca9c6: Waiting
1731132f4b49: Waiting
51c57cc69f67: Waiting
6aa50616693c: Waiting
b4c054bf4c8a: Waiting
5ae971429117: Waiting
848b95ad96b5: Verifying Checksum
848b95ad96b5: Download complete
59f6381879f6: Verifying Checksum
59f6381879f6: Download complete
655ed0df26cf: Verifying Checksum
655ed0df26cf: Download complete
51c57cc69f67: Verifying Checksum
51c57cc69f67: Download complete
1731132f4b49: Verifying Checksum
1731132f4b49: Download complete
6aa50616693c: Verifying Checksum
6aa50616693c: Download complete
e43c2058e496: Verifying Checksum
e43c2058e496: Download complete
59f6381879f6: Pull complete
1731132f4b49: Verifying Checksum
1731132f4b49: Download complete
51c57cc69f67: Verifying Checksum
51c57cc69f67: Download complete
5ae971429117: Verifying Checksum
5ae971429117: Download complete
6aa50616693c: Verifying Checksum
6aa50616693c: Download complete
655ed0df26cf: Pull complete
5ae971429117: Verifying Checksum
5ae971429117: Download complete
848b95ad96b5: Pull complete
e43c2058e496: Pull complete
59f6381879f6: Pull complete
655ed0df26cf: Pull complete
848b95ad96b5: Pull complete
e43c2058e496: Pull complete
Still waiting to schedule task
There are no nodes with the label ‘rocm-docker&&vega&&AMD_Radeon_Instinct_MI60’
aa99269ca9c6: Verifying Checksum
aa99269ca9c6: Download complete
b4c054bf4c8a: Verifying Checksum
b4c054bf4c8a: Download complete
aa99269ca9c6: Download complete
b4c054bf4c8a: Verifying Checksum
b4c054bf4c8a: Download complete
aa99269ca9c6: Pull complete
1731132f4b49: Pull complete
51c57cc69f67: Pull complete
6aa50616693c: Pull complete
b4c054bf4c8a: Pull complete
5ae971429117: Pull complete
Digest: sha256:10ab0f09fcdc796b4a2325ef1bce8f766f4a3500eab5a83780f80475ae26c7a6
Status: Downloaded newer image for nvidia/cuda:11.0.3-devel-ubuntu20.04
 ---> 66deaf56c203
Step 3/25 : ARG NPROCS=4
 ---> Running in f17a208d08ed
aa99269ca9c6: Pull complete
1731132f4b49: Pull complete
51c57cc69f67: Pull complete
6aa50616693c: Pull complete
 ---> Removed intermediate container f17a208d08ed
 ---> eb4521c67836
Step 4/25 : RUN DISTRO=ubuntu2004 &&     apt-key adv --fetch-keys https://developer.download.nvidia.com/compute/cuda/repos/$DISTRO/x86_64/3bf863cc.pub
 ---> Running in 73a8ef706398
Warning: apt-key output should not be parsed (stdout is not a terminal)
Executing: /tmp/apt-key-gpghome.SCaXHJNeHn/gpg.1.sh --fetch-keys https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/3bf863cc.pub
gpg: requesting key from 'https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/3bf863cc.pub'
gpg: key A4B469963BF863CC: "cudatools <cudatools@nvidia.com>" not changed
gpg: Total number processed: 1
gpg:              unchanged: 1
 ---> Removed intermediate container 73a8ef706398
 ---> d66077b85dca
Step 5/25 : RUN apt-get update && apt-get install -y         bc         ccache         wget         openssh-client         libgtest-dev     &&     apt-get clean &&     rm -rf /var/lib/apt/lists/*
 ---> Running in cb008501d04e
Get:1 http://security.ubuntu.com/ubuntu focal-security InRelease [114 kB]
Get:2 https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64  InRelease [1581 B]
Get:3 http://archive.ubuntu.com/ubuntu focal InRelease [265 kB]
Get:4 https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64  Packages [1461 kB]
Get:5 http://archive.ubuntu.com/ubuntu focal-updates InRelease [114 kB]
Get:6 http://archive.ubuntu.com/ubuntu focal-backports InRelease [108 kB]
Get:7 http://security.ubuntu.com/ubuntu focal-security/universe amd64 Packages [1194 kB]
Get:8 http://archive.ubuntu.com/ubuntu focal/universe amd64 Packages [11.3 MB]
Get:9 http://security.ubuntu.com/ubuntu focal-security/multiverse amd64 Packages [29.7 kB]
Get:10 http://security.ubuntu.com/ubuntu focal-security/main amd64 Packages [3492 kB]
Get:11 http://security.ubuntu.com/ubuntu focal-security/restricted amd64 Packages [3424 kB]
Get:12 http://archive.ubuntu.com/ubuntu focal/main amd64 Packages [1275 kB]
Get:13 http://archive.ubuntu.com/ubuntu focal/restricted amd64 Packages [33.4 kB]
Get:14 http://archive.ubuntu.com/ubuntu focal/multiverse amd64 Packages [177 kB]
Get:15 http://archive.ubuntu.com/ubuntu focal-updates/multiverse amd64 Packages [32.4 kB]
Get:16 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 Packages [3967 kB]
Get:17 http://archive.ubuntu.com/ubuntu focal-updates/universe amd64 Packages [1489 kB]
Get:18 http://archive.ubuntu.com/ubuntu focal-updates/restricted amd64 Packages [3574 kB]
Get:19 http://archive.ubuntu.com/ubuntu focal-backports/universe amd64 Packages [28.6 kB]
Get:20 http://archive.ubuntu.com/ubuntu focal-backports/main amd64 Packages [55.2 kB]
Fetched 32.2 MB in 2s (12.9 MB/s)
Reading package lists...
Reading package lists...
Building dependency tree...
Reading state information...
The following additional packages will be installed:
  googletest krb5-locales libcbor0.6 libfido2-1 libgssapi-krb5-2 libk5crypto3
  libkeyutils1 libkrb5-3 libkrb5support0 libpsl5 libxmuu1 publicsuffix xauth
Suggested packages:
  distcc | icecc krb5-doc krb5-user keychain libpam-ssh monkeysphere
  ssh-askpass
The following NEW packages will be installed:
  bc ccache googletest krb5-locales libcbor0.6 libfido2-1 libgssapi-krb5-2
  libgtest-dev libk5crypto3 libkeyutils1 libkrb5-3 libkrb5support0 libpsl5
  libxmuu1 openssh-client publicsuffix wget xauth
0 upgraded, 18 newly installed, 0 to remove and 38 not upgraded.
Need to get 4282 kB of archives.
After this operation, 30.4 MB of additional disk space will be used.
Get:1 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 krb5-locales all 1.17-6ubuntu4.4 [11.5 kB]
Get:2 http://archive.ubuntu.com/ubuntu focal/main amd64 libcbor0.6 amd64 0.6.0-0ubuntu1 [21.1 kB]
Get:3 http://archive.ubuntu.com/ubuntu focal/main amd64 libfido2-1 amd64 1.3.1-1ubuntu2 [47.9 kB]
Get:4 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libkrb5support0 amd64 1.17-6ubuntu4.4 [31.0 kB]
Get:5 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libk5crypto3 amd64 1.17-6ubuntu4.4 [79.9 kB]
Get:6 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libkeyutils1 amd64 1.6-6ubuntu1.1 [10.3 kB]
Get:7 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libkrb5-3 amd64 1.17-6ubuntu4.4 [330 kB]
Get:8 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libgssapi-krb5-2 amd64 1.17-6ubuntu4.4 [121 kB]
Get:9 http://archive.ubuntu.com/ubuntu focal/main amd64 libpsl5 amd64 0.21.0-1ubuntu1 [51.5 kB]
Get:10 http://archive.ubuntu.com/ubuntu focal/main amd64 libxmuu1 amd64 2:1.1.3-0ubuntu1 [9728 B]
Get:11 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 openssh-client amd64 1:8.2p1-4ubuntu0.11 [670 kB]
Get:12 http://archive.ubuntu.com/ubuntu focal/main amd64 publicsuffix all 20200303.0012-1 [111 kB]
Get:13 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 wget amd64 1.20.3-1ubuntu2 [348 kB]
Get:14 http://archive.ubuntu.com/ubuntu focal/main amd64 xauth amd64 1:1.1-0ubuntu1 [25.0 kB]
Get:15 http://archive.ubuntu.com/ubuntu focal/main amd64 bc amd64 1.07.1-2build1 [86.3 kB]
Get:16 http://archive.ubuntu.com/ubuntu focal/main amd64 ccache amd64 3.7.7-1 [121 kB]
Get:17 http://archive.ubuntu.com/ubuntu focal/universe amd64 googletest all 1.10.0-2 [623 kB]
Get:18 http://archive.ubuntu.com/ubuntu focal/universe amd64 libgtest-dev amd64 1.10.0-2 [1583 kB]
debconf: delaying package configuration, since apt-utils is not installed
Fetched 4282 kB in 2s (1837 kB/s)
Selecting previously unselected package krb5-locales.
(Reading database ... 
(Reading database ... 5%
(Reading database ... 10%
(Reading database ... 15%
(Reading database ... 20%
(Reading database ... 25%
(Reading database ... 30%
(Reading database ... 35%
(Reading database ... 40%
(Reading database ... 45%
(Reading database ... 50%
(Reading database ... 55%
(Reading database ... 60%
(Reading database ... 65%
(Reading database ... 70%
(Reading database ... 75%
(Reading database ... 80%
(Reading database ... 85%
(Reading database ... 90%
(Reading database ... 95%
(Reading database ... 100%
(Reading database ... 14332 files and directories currently installed.)
Preparing to unpack .../00-krb5-locales_1.17-6ubuntu4.4_all.deb ...
Unpacking krb5-locales (1.17-6ubuntu4.4) ...
Selecting previously unselected package libcbor0.6:amd64.
Preparing to unpack .../01-libcbor0.6_0.6.0-0ubuntu1_amd64.deb ...
Unpacking libcbor0.6:amd64 (0.6.0-0ubuntu1) ...
Selecting previously unselected package libfido2-1:amd64.
Preparing to unpack .../02-libfido2-1_1.3.1-1ubuntu2_amd64.deb ...
Unpacking libfido2-1:amd64 (1.3.1-1ubuntu2) ...
Selecting previously unselected package libkrb5support0:amd64.
Preparing to unpack .../03-libkrb5support0_1.17-6ubuntu4.4_amd64.deb ...
Unpacking libkrb5support0:amd64 (1.17-6ubuntu4.4) ...
Selecting previously unselected package libk5crypto3:amd64.
Preparing to unpack .../04-libk5crypto3_1.17-6ubuntu4.4_amd64.deb ...
Unpacking libk5crypto3:amd64 (1.17-6ubuntu4.4) ...
Selecting previously unselected package libkeyutils1:amd64.
Preparing to unpack .../05-libkeyutils1_1.6-6ubuntu1.1_amd64.deb ...
Unpacking libkeyutils1:amd64 (1.6-6ubuntu1.1) ...
Selecting previously unselected package libkrb5-3:amd64.
Preparing to unpack .../06-libkrb5-3_1.17-6ubuntu4.4_amd64.deb ...
Unpacking libkrb5-3:amd64 (1.17-6ubuntu4.4) ...
Selecting previously unselected package libgssapi-krb5-2:amd64.
Preparing to unpack .../07-libgssapi-krb5-2_1.17-6ubuntu4.4_amd64.deb ...
Unpacking libgssapi-krb5-2:amd64 (1.17-6ubuntu4.4) ...
Selecting previously unselected package libpsl5:amd64.
Preparing to unpack .../08-libpsl5_0.21.0-1ubuntu1_amd64.deb ...
Unpacking libpsl5:amd64 (0.21.0-1ubuntu1) ...
Selecting previously unselected package libxmuu1:amd64.
Preparing to unpack .../09-libxmuu1_2%3a1.1.3-0ubuntu1_amd64.deb ...
Unpacking libxmuu1:amd64 (2:1.1.3-0ubuntu1) ...
Selecting previously unselected package openssh-client.
Preparing to unpack .../10-openssh-client_1%3a8.2p1-4ubuntu0.11_amd64.deb ...
Unpacking openssh-client (1:8.2p1-4ubuntu0.11) ...
Selecting previously unselected package publicsuffix.
Preparing to unpack .../11-publicsuffix_20200303.0012-1_all.deb ...
Unpacking publicsuffix (20200303.0012-1) ...
Selecting previously unselected package wget.
Preparing to unpack .../12-wget_1.20.3-1ubuntu2_amd64.deb ...
Unpacking wget (1.20.3-1ubuntu2) ...
Selecting previously unselected package xauth.
Preparing to unpack .../13-xauth_1%3a1.1-0ubuntu1_amd64.deb ...
Unpacking xauth (1:1.1-0ubuntu1) ...
Selecting previously unselected package bc.
Preparing to unpack .../14-bc_1.07.1-2build1_amd64.deb ...
Unpacking bc (1.07.1-2build1) ...
Selecting previously unselected package ccache.
Preparing to unpack .../15-ccache_3.7.7-1_amd64.deb ...
Unpacking ccache (3.7.7-1) ...
Selecting previously unselected package googletest.
Preparing to unpack .../16-googletest_1.10.0-2_all.deb ...
Unpacking googletest (1.10.0-2) ...
Selecting previously unselected package libgtest-dev:amd64.
Preparing to unpack .../17-libgtest-dev_1.10.0-2_amd64.deb ...
Unpacking libgtest-dev:amd64 (1.10.0-2) ...
Setting up libkeyutils1:amd64 (1.6-6ubuntu1.1) ...
Setting up libpsl5:amd64 (0.21.0-1ubuntu1) ...
Setting up wget (1.20.3-1ubuntu2) ...
Setting up ccache (3.7.7-1) ...
Updating symlinks in /usr/lib/ccache ...
Setting up bc (1.07.1-2build1) ...
Setting up krb5-locales (1.17-6ubuntu4.4) ...
Setting up libcbor0.6:amd64 (0.6.0-0ubuntu1) ...
Setting up googletest (1.10.0-2) ...
Setting up libkrb5support0:amd64 (1.17-6ubuntu4.4) ...
Setting up libk5crypto3:amd64 (1.17-6ubuntu4.4) ...
Setting up libkrb5-3:amd64 (1.17-6ubuntu4.4) ...
Setting up libfido2-1:amd64 (1.3.1-1ubuntu2) ...
Setting up publicsuffix (20200303.0012-1) ...
Setting up libxmuu1:amd64 (2:1.1.3-0ubuntu1) ...
Setting up libgtest-dev:amd64 (1.10.0-2) ...
Setting up libgssapi-krb5-2:amd64 (1.17-6ubuntu4.4) ...
Setting up xauth (1:1.1-0ubuntu1) ...
Setting up openssh-client (1:8.2p1-4ubuntu0.11) ...
Processing triggers for libc-bin (2.31-0ubuntu9.12) ...
 ---> Removed intermediate container cb008501d04e
 ---> f97937143bd9
Step 6/25 : RUN KEYDUMP_URL=https://cloud.cees.ornl.gov/download &&     KEYDUMP_FILE=keydump &&     wget --quiet ${KEYDUMP_URL}/${KEYDUMP_FILE} &&     wget --quiet ${KEYDUMP_URL}/${KEYDUMP_FILE}.sig &&     gpg --import ${KEYDUMP_FILE} &&     gpg --verify ${KEYDUMP_FILE}.sig ${KEYDUMP_FILE} &&     rm ${KEYDUMP_FILE}*
 ---> Running in 3a1274bfcc1e
gpg: directory '/root/.gnupg' created
gpg: keybox '/root/.gnupg/pubring.kbx' created
gpg: /root/.gnupg/trustdb.gpg: trustdb created
gpg: key 48822FDA51C1DA7A: public key "Damien Lebrun-Grandie <dalg24@gmail.com>" imported
gpg: key A2C794A986419D8A: public key "Tom Stellard <tstellar@redhat.com>" imported
gpg: key 0FC3042E345AD05D: public key "Hans Wennborg <hans@chromium.org>" imported
gpg: key EC8FEF3A7BFB4EDA: 24 signatures not checked due to missing keys
gpg: key EC8FEF3A7BFB4EDA: public key "Brad King" imported
gpg: key 379CE192D401AB61: public key "Bintray (by JFrog) <bintray@bintray.com>" imported
gpg: Total number processed: 5
gpg:               imported: 5
gpg: no ultimately trusted keys found
gpg: Signature made Thu May  7 23:44:59 2020 UTC
gpg:                using RSA key 061CFF3BA41AA45D25BCE7097A0994F834C86684
gpg: Good signature from "Damien Lebrun-Grandie <dalg24@gmail.com>" [unknown]
gpg: WARNING: This key is not certified with a trusted signature!
gpg:          There is no indication that the signature belongs to the owner.
Primary key fingerprint: E226 98C7 0BF0 7BDA 37E1  4154 4882 2FDA 51C1 DA7A
     Subkey fingerprint: 061C FF3B A41A A45D 25BC  E709 7A09 94F8 34C8 6684
 ---> Removed intermediate container 3a1274bfcc1e
 ---> 343a776272f1
Step 7/25 : ENV CMAKE_DIR=/opt/cmake
 ---> Running in 8a4f17648447
 ---> Removed intermediate container 8a4f17648447
 ---> 3399127e9838
Step 8/25 : RUN CMAKE_VERSION=3.16.9 &&     CMAKE_KEY=2D2CEF1034921684 &&     CMAKE_URL=https://github.com/Kitware/CMake/releases/download/v${CMAKE_VERSION} &&     CMAKE_SCRIPT=cmake-${CMAKE_VERSION}-Linux-x86_64.sh &&     CMAKE_SHA256=cmake-${CMAKE_VERSION}-SHA-256.txt &&     wget --quiet ${CMAKE_URL}/${CMAKE_SHA256} &&     wget --quiet ${CMAKE_URL}/${CMAKE_SHA256}.asc &&     wget --quiet ${CMAKE_URL}/${CMAKE_SCRIPT} &&     gpg --verify ${CMAKE_SHA256}.asc ${CMAKE_SHA256} &&     grep ${CMAKE_SCRIPT} ${CMAKE_SHA256} | sha256sum --check &&     mkdir -p ${CMAKE_DIR} &&     sh ${CMAKE_SCRIPT} --skip-license --prefix=${CMAKE_DIR} &&     rm ${CMAKE_SCRIPT}
 ---> Running in 0d4c2be3dda5
gpg: Signature made Tue Sep 15 13:13:30 2020 UTC
gpg:                using RSA key C6C265324BBEBDC350B513D02D2CEF1034921684
gpg: Good signature from "Brad King" [unknown]
gpg:                 aka "Brad King <brad.king@kitware.com>" [unknown]
gpg:                 aka "[jpeg image of size 4005]" [unknown]
gpg: Note: This key has expired!
Primary key fingerprint: CBA2 3971 357C 2E65 90D9  EFD3 EC8F EF3A 7BFB 4EDA
     Subkey fingerprint: C6C2 6532 4BBE BDC3 50B5  13D0 2D2C EF10 3492 1684
cmake-3.16.9-Linux-x86_64.sh: OK
CMake Installer Version: 3.16.9, Copyright (c) Kitware
This is a self-extracting archive.
The archive will be extracted to: /opt/cmake

Using target directory: /opt/cmake
Extracting, please wait...

Unpacking finished successfully
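Step 8/25 above compresses the CMake download-and-verify flow into a single RUN line. Spelled out for readability as a sketch (all values taken from the log; commands and ordering unchanged, not the literal Dockerfile text):

    CMAKE_VERSION=3.16.9
    CMAKE_URL=https://github.com/Kitware/CMake/releases/download/v${CMAKE_VERSION}
    CMAKE_SCRIPT=cmake-${CMAKE_VERSION}-Linux-x86_64.sh
    CMAKE_SHA256=cmake-${CMAKE_VERSION}-SHA-256.txt
    wget --quiet ${CMAKE_URL}/${CMAKE_SHA256} ${CMAKE_URL}/${CMAKE_SHA256}.asc ${CMAKE_URL}/${CMAKE_SCRIPT}
    gpg --verify ${CMAKE_SHA256}.asc ${CMAKE_SHA256}            # signature covers the checksum file
    grep ${CMAKE_SCRIPT} ${CMAKE_SHA256} | sha256sum --check    # checksum covers the installer
    sh ${CMAKE_SCRIPT} --skip-license --prefix=/opt/cmake && rm ${CMAKE_SCRIPT}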
b4c054bf4c8a: Pull complete
5ae971429117: Pull complete
Digest: sha256:10ab0f09fcdc796b4a2325ef1bce8f766f4a3500eab5a83780f80475ae26c7a6
Status: Downloaded newer image for nvidia/cuda:11.0.3-devel-ubuntu20.04
 ---> 66deaf56c203
Step 3/18 : ARG NPROCS=4
 ---> Running in ee1fda7c84d8
 ---> Removed intermediate container 0d4c2be3dda5
 ---> 7a96946e97f6
Step 9/25 : ENV PATH=${CMAKE_DIR}/bin:$PATH
 ---> Running in f5fb4f6b37be
 ---> Removed intermediate container f5fb4f6b37be
 ---> 99fa69e93b56
Step 10/25 : ENV OPENMPI_DIR=/opt/openmpi
 ---> Running in 7cbc103bae6f
 ---> Removed intermediate container 7cbc103bae6f
 ---> ae6eaaac1a9c
Step 11/25 : RUN OPENMPI_VERSION=4.0.2 &&     OPENMPI_VERSION_SHORT=4.0 &&     OPENMPI_SHA1=32ce3761288575fb8e4f6296c9105c3a25cf3235 &&     OPENMPI_URL=https://download.open-mpi.org/release/open-mpi/v${OPENMPI_VERSION_SHORT}/openmpi-${OPENMPI_VERSION}.tar.bz2 &&     OPENMPI_ARCHIVE=openmpi-${OPENMPI_VERSION}.tar.bz2 &&     [ ! -z "${CUDA_VERSION}" ] && CUDA_OPTIONS=--with-cuda || true &&     SCRATCH_DIR=/scratch && mkdir -p ${SCRATCH_DIR} && cd ${SCRATCH_DIR} &&     wget --quiet ${OPENMPI_URL} --output-document=${OPENMPI_ARCHIVE} &&     echo "${OPENMPI_SHA1} ${OPENMPI_ARCHIVE}" | sha1sum -c &&     mkdir -p openmpi &&     tar -xf ${OPENMPI_ARCHIVE} -C openmpi --strip-components=1 &&     mkdir -p build && cd build &&     ../openmpi/configure --prefix=${OPENMPI_DIR} ${CUDA_OPTIONS} CFLAGS=-w &&     make -j${NPROCS} install &&     rm -rf ${SCRATCH_DIR}
 ---> Running in 82e0ab48e4bf
openmpi-4.0.2.tar.bz2: OK
 ---> Removed intermediate container ee1fda7c84d8
 ---> 3f0a296c6745
Step 4/18 : RUN DISTRO=ubuntu2004 &&     apt-key adv --fetch-keys https://developer.download.nvidia.com/compute/cuda/repos/$DISTRO/x86_64/3bf863cc.pub
 ---> Running in d4cfdb93286e
Warning: apt-key output should not be parsed (stdout is not a terminal)
Executing: /tmp/apt-key-gpghome.4GuNnhUxVR/gpg.1.sh --fetch-keys https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/3bf863cc.pub
gpg: requesting key from 'https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/3bf863cc.pub'
gpg: key A4B469963BF863CC: "cudatools <cudatools@nvidia.com>" not changed
gpg: Total number processed: 1
gpg:              unchanged: 1
checking for perl... perl

============================================================================
== Configuring Open MPI
============================================================================

*** Startup tests
checking build system type... x86_64-unknown-linux-gnu
checking host system type... x86_64-unknown-linux-gnu
checking target system type... x86_64-unknown-linux-gnu
checking for gcc... gcc
checking whether the C compiler works... yes
checking for C compiler default output file name... a.out
checking for suffix of executables... 
checking whether we are cross compiling... no
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether gcc accepts -g... yes
checking for gcc option to accept ISO C89... none needed
checking whether gcc understands -c and -o together... yes
checking how to run the C preprocessor... gcc -E
checking for grep that handles long lines and -e... /usr/bin/grep
checking for egrep... /usr/bin/grep -E
checking for ANSI C header files... yes
checking for sys/types.h... yes
checking for sys/stat.h... yes
checking for stdlib.h... yes
checking for string.h... yes
checking for memory.h... yes
checking for strings.h... yes
checking for inttypes.h... yes
checking for stdint.h... yes
checking for unistd.h... yes
checking minix/config.h usability... no
checking minix/config.h presence... no
checking for minix/config.h... no
checking whether it is safe to define __EXTENSIONS__... yes
checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
checking for a thread-safe mkdir -p... /usr/bin/mkdir -p
checking for gawk... no
checking for mawk... mawk
checking whether make sets $(MAKE)... yes
checking for style of include used by make... GNU
checking whether make supports nested variables... yes
checking whether UID '0' is supported by ustar format... yes
checking whether GID '0' is supported by ustar format... yes
checking how to create a ustar tar archive... gnutar
checking dependency style of gcc... gcc3
checking whether make supports nested variables... (cached) yes

*** Checking versions
checking for repo version... v4.0.2
checking Open MPI version... 4.0.2
checking Open MPI release date... Oct 07, 2019
checking Open MPI repository version... v4.0.2
checking for repo version... v4.0.2
checking Open MPI Run-Time Environment version... 4.0.2
checking Open MPI Run-Time Environment release date... Oct 07, 2019
checking Open MPI Run-Time Environment repository version... v4.0.2
checking for repo version... v4.0.2
checking Open SHMEM version... 4.0.2
checking Open SHMEM release date... Oct 07, 2019
checking Open SHMEM repository version... v4.0.2
checking for repo version... v4.0.2
checking Open Portable Access Layer version... 4.0.2
checking Open Portable Access Layer release date... Oct 07, 2019
checking Open Portable Access Layer repository version... v4.0.2
checking for bootstrap Autoconf version... 2.69
checking for bootstrap Automake version... 1.15
checking for bootstrap Libtool version... 2.4.6

*** Initialization, setup
configure: builddir: /scratch/build
configure: srcdir: /scratch/openmpi
configure: Detected VPATH build
installing to directory "/opt/openmpi"

*** OPAL Configuration options
checking if want to run code coverage... no
checking if want to compile with branch probabilities... no
checking if want to debug memory usage... no
checking if want to profile memory usage... no
checking if want developer-level compiler pickyness... no
checking if want developer-level debugging code... no
checking if want to developer-level timing framework... no
checking if want to install project-internal header files... no
checking if want pretty-print stacktrace... yes
checking if want pty support... yes
checking if want weak symbol support... yes
checking if want dlopen support... yes
checking for default value of mca_base_component_show_load_errors... enabled by default
checking if want heterogeneous support... no
checking if word-sized integers must be word-size aligned... no
checking if want IPv6 support... no
checking if want package/brand string... Open MPI root@82e0ab48e4bf Distribution
checking if want ident string... 4.0.2
checking if want to use an alternative checksum algo for messages... no
checking maximum length of processor name... 256
checking maximum length of error string... 256
checking maximum length of object name... 64
checking maximum length of info key... 36
checking maximum length of info val... 256
checking maximum length of port name... 1024
checking maximum length of datarep string... 128
checking if want getpwuid support... yes
checking for zlib in... (default search paths)
checking zlib.h usability... no
checking zlib.h presence... no
checking for zlib.h... no
checking will zlib support be built... no
checking __NetBSD__... no
checking __FreeBSD__... no
checking __OpenBSD__... no
checking __DragonFly__... no
checking __386BSD__... no
checking __bsdi__... no
checking __APPLE__... no
checking __linux__... yes
checking __sun__... no
checking __sun... no
checking netdb.h usability... yes
checking netdb.h presence... yes
checking for netdb.h... yes
checking netinet/in.h usability... yes
checking netinet/in.h presence... yes
checking for netinet/in.h... yes
checking netinet/tcp.h usability... yes
checking netinet/tcp.h presence... yes
checking for netinet/tcp.h... yes
checking for struct sockaddr_in... yes
checking if --with-cuda is set... found
checking for struct CUipcMemHandle_st.reserved... yes
checking whether CU_POINTER_ATTRIBUTE_SYNC_MEMOPS is declared... yes
checking whether cuPointerGetAttributes is declared... yes
checking if have cuda support... yes (-I/usr/local/cuda/include)
checking if user requested PMI support... no
checking if user requested internal PMIx support()... no
checking for pmix.h in /usr... not found
checking for pmix.h in /usr/include... not found
configure: WARNING: discovered external PMIx version is less than internal version 3.x
configure: WARNING: using internal PMIx

*** ORTE Configuration options
checking if want orterun "--prefix" behavior to be enabled by default... no

*** OMPI Configuration options
checking if want compile-time warnings inside of mpi.h... yes
checking if want sparse process groups... no
checking if want peruse support... no
checking if want Fortran MPI bindings...  (try)
checking if want C++ bindings... no
checking if want MPI::SEEK_SET support... yes
checking if want run-time MPI parameter checking... runtime

*** OSHMEM Configuration options
checking if want oshmem... yes
checking if want SGI/Quadrics compatibility mode... yes
checking if want OSHMEM API parameter checking... always
checking for on_exit... yes
checking if want pshmem... yes
checking if want to build OSHMEM fortran bindings... yes
no
checking if want custom libmpi(_FOO) name... mpi
checking if want wrapper compiler rpath support... yes
checking if want wrapper compiler runpath support... yes

============================================================================
== Compiler and preprocessor tests
============================================================================

*** C compiler and preprocessor
checking for gcc... (cached) gcc
checking whether we are using the GNU C compiler... (cached) yes
checking whether gcc accepts -g... (cached) yes
checking for gcc option to accept ISO C89... (cached) none needed
checking whether gcc understands -c and -o together... (cached) yes
checking if gcc requires a flag for C11... no
configure: verifying gcc supports C11 without a flag
checking if gcc  supports C11 _Thread_local... yes
checking if gcc  supports C11 atomic variables... yes
checking if gcc  supports C11 _Atomic keyword... yes
checking if gcc  supports C11 _Generic keyword... yes
checking if gcc  supports C11 _Static_assert... yes
checking if gcc  supports C11 atomic_fetch_xor_explicit...
 ---> Removed intermediate container d4cfdb93286e
 ---> 9e19caf8cff7
Step 5/18 : RUN apt-get update && DEBIAN_FRONTEND=noninteractive apt-get install -yq         bc         wget         ccache         ninja-build         python3         git         vim         jq         libgtest-dev         libopenmpi-dev         &&     apt-get clean &&     rm -rf /var/lib/apt/lists/*
yes
configure: no flag required for C11 support
checking if gcc  supports __thread... yes
checking if gcc  supports C11 _Thread_local... yes
checking for the C compiler vendor...
 ---> Running in 56d9a27141e1
gnu
checking for ANSI C header files... (cached) yes
checking if gcc supports -finline-functions... yes
checking if gcc supports -fno-strict-aliasing... yes
configure: WARNING:  -fno-strict-aliasing has been added to CFLAGS
checking if gcc supports __builtin_expect... yes
checking if gcc supports __builtin_prefetch... yes
checking if gcc supports __builtin_clz... yes
checking for C optimization flags... -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing
checking for Interix environment... no
checking for C ident string support... #ident
checking for int8_t... yes
checking for uint8_t... yes
checking for int16_t... yes
checking for uint16_t... yes
checking for int32_t... yes
checking for uint32_t... yes
checking for int64_t... yes
checking for uint64_t... yes
checking for int128_t... no
checking for __int128... yes
checking for uint128_t... no
checking for long long... yes
checking for __float128... yes
checking for long double... yes
checking complex.h usability... yes
checking complex.h presence... yes
checking for complex.h... yes
checking for float _Complex... yes
checking for double _Complex... yes
checking for long double _Complex... yes
checking for intptr_t... yes
checking for uintptr_t... yes
checking for mode_t... yes
checking for ssize_t... yes
checking for ptrdiff_t... yes
checking size of _Bool... 1
checking size of char... 1
checking size of short... 2
checking size of int... 4
checking size of long...
Get:1 http://security.ubuntu.com/ubuntu focal-security InRelease [114 kB]
Get:2 https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64  InRelease [1581 B]
Get:3 http://archive.ubuntu.com/ubuntu focal InRelease [265 kB]
8
checking size of long long... 8
checking size of float... 4
checking size of double...
Get:4 http://archive.ubuntu.com/ubuntu focal-updates InRelease [114 kB]
Get:5 http://archive.ubuntu.com/ubuntu focal-backports InRelease [108 kB]
8
checking size of long double... 16
checking size of __float128... 16
checking size of float _Complex...
Get:6 https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64  Packages [1461 kB]
8
checking size of double _Complex... 16
checking size of long double _Complex... 32
checking size of void *... 8
checking size of size_t... 8
checking size of ssize_t...
Get:7 http://security.ubuntu.com/ubuntu focal-security/multiverse amd64 Packages [29.7 kB]
Get:8 http://security.ubuntu.com/ubuntu focal-security/restricted amd64 Packages [3424 kB]
8
checking size of ptrdiff_t... 8
checking size of wchar_t... 4
checking size of pid_t... 4
checking size of atomic_short... 2
checking size of atomic_int... 4
checking size of atomic_long...
Get:9 http://security.ubuntu.com/ubuntu focal-security/universe amd64 Packages [1194 kB]
Get:10 http://security.ubuntu.com/ubuntu focal-security/main amd64 Packages [3492 kB]
8
checking size of atomic_llong... 8
checking alignment of bool... 1
checking alignment of int8_t...
Get:11 http://archive.ubuntu.com/ubuntu focal/multiverse amd64 Packages [177 kB]
Get:12 http://archive.ubuntu.com/ubuntu focal/main amd64 Packages [1275 kB]
1
checking alignment of int16_t... 2
checking alignment of int32_t...
Get:13 http://archive.ubuntu.com/ubuntu focal/universe amd64 Packages [11.3 MB]
4
checking alignment of int64_t...
Get:14 http://archive.ubuntu.com/ubuntu focal/restricted amd64 Packages [33.4 kB]
8
checking alignment of char... 1
checking alignment of short... 2
checking alignment of wchar_t...
Get:15 http://archive.ubuntu.com/ubuntu focal-updates/multiverse amd64 Packages [32.4 kB]
Get:16 http://archive.ubuntu.com/ubuntu focal-updates/universe amd64 Packages [1489 kB]
4
checking alignment of int...
Get:17 http://archive.ubuntu.com/ubuntu focal-updates/restricted amd64 Packages [3574 kB]
Get:18 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 Packages [3967 kB]
4
checking alignment of long... 8
checking alignment of long long... 8
checking alignment of float... 4
checking alignment of double... 8
checking alignment of long double... 16
checking alignment of __float128... 16
checking alignment of float _Complex... 4
checking alignment of double _Complex...
Get:19 http://archive.ubuntu.com/ubuntu focal-backports/universe amd64 Packages [28.6 kB]
Get:20 http://archive.ubuntu.com/ubuntu focal-backports/main amd64 Packages [55.2 kB]
8
checking alignment of long double _Complex... 16
checking alignment of void *... 8
checking alignment of size_t... 8
checking for weak symbol support... yes
checking for macro weak symbol support... yes
checking for functional offsetof macro... yes

*** C++ compiler and preprocessor
checking for g++... g++
checking whether we are using the GNU C++ compiler... yes
checking whether g++ accepts -g... yes
checking dependency style of g++... gcc3
checking how to run the C++ preprocessor... g++ -E
checking for the C++ compiler vendor...
Fetched 32.2 MB in 6s (5268 kB/s)
Reading package lists...
gnu
checking if g++ supports -finline-functions... yes
configure: WARNING:  -finline-functions has been added to CXXFLAGS
checking if C and C++ are link compatible... yes
checking for C++ optimization flags... -O3 -DNDEBUG -finline-functions
checking size of bool... 1
checking alignment of bool... (cached) 1

*** C++ compiler and preprocessor
checking whether we are using the GNU C++ compiler... (cached) yes
checking whether g++ accepts -g... (cached) yes
checking dependency style of g++... (cached) gcc3
checking how to run the C++ preprocessor... g++ -E
checking if C++ compiler works... yes
checking if g++ supports -finline-functions... yes
configure: WARNING:  -finline-functions has been added to CXXFLAGS
checking if C and C++ are link compatible... (cached) yes
checking for C++ optimization flags... -O3 -DNDEBUG -finline-functions
checking size of bool... (cached) 1
checking alignment of bool... (cached) 1
checking if able to build the MPI C++ bindings... no
checking if want C++ exception handling... skipped

*** Compiler characteristics
checking for __attribute__... yes
checking for __attribute__(aligned)... yes
checking for __attribute__(always_inline)... yes
checking for __attribute__(cold)... yes
checking for __attribute__(const)... yes
checking for __attribute__(deprecated)... yes
checking for __attribute__(deprecated_argument)... yes
checking for __attribute__(error)... yes
checking for __attribute__(format)... no
checking for __attribute__(format_funcptr)... 
no
checking for __attribute__(hot)... yes
checking for __attribute__(malloc)... yes
checking for __attribute__(may_alias)... yes
checking for __attribute__(no_instrument_function)...
Reading package lists...
yes
checking for __attribute__(noinline)... yes
checking for __attribute__(nonnull)... no
checking for __attribute__(noreturn)... yes
checking for __attribute__(noreturn_funcptr)... yes
checking for __attribute__(packed)... yes
checking for __attribute__(pure)... yes
checking for __attribute__(sentinel)... no
checking for __attribute__(unused)... yes
checking for __attribute__(visibility)... yes
checking for __attribute__(warn_unused_result)... no
checking for __attribute__(weak_alias)... yes
checking for __attribute__(destructor)... yes
checking for __attribute__(optnone)... no
checking for __attribute__(extension)... yes
checking for compiler familyid... 1
checking for compiler familyname... GNU
checking for compiler version... 590848
checking for compiler version_str... 
Building dependency tree...
9.4.0

*** Java MPI bindings
checking if want Java bindings... no

*** OpenSHMEM profiling
checking if pshmem will be enabled... yes (weak symbols supported)

*** Assembler
checking dependency style of gcc... gcc3
checking for BSD- or MS-compatible name lister (nm)... /usr/bin/nm -B
checking the name lister (/usr/bin/nm -B) interface... BSD nm
checking for fgrep... /usr/bin/grep -F
checking for __atomic builtin atomics... yes
checking for __atomic_compare_exchange_n... no
checking for __atomic_compare_exchange_n with -mcx16... no
checking for __atomic_compare_exchange_n with -latomic... yes
checking if __atomic_compare_exchange_n() gives correct results... 
Reading state information...
yes
checking if __int128 atomic compare-and-swap is always lock-free... no
checking for __sync_bool_compare_and_swap... no
checking for __sync_bool_compare_and_swap with -mcx16... yes
checking if __sync_bool_compare_and_swap() gives correct results... yes
checking if .proc/endp is needed... no
checking directive for setting text section... .text
checking directive for exporting symbols... .globl
checking for objdump... objdump
checking if .note.GNU-stack is needed... yes
checking suffix for labels... :
checking prefix for global symbol labels...
The following additional packages will be installed:
  autoconf automake autotools-dev cpp-8 file gcc-8 gcc-8-base gfortran
  gfortran-8 gfortran-9 git-man googletest ibverbs-providers krb5-locales less
  libbrotli1 libcaf-openmpi-3 libcanberra0 libcbor0.6 libcoarrays-dev
  libcoarrays-openmpi-dev libcurl3-gnutls liberror-perl libevent-2.1-7
  libevent-core-2.1-7 libevent-dev libevent-extra-2.1-7 libevent-openssl-2.1-7
  libevent-pthreads-2.1-7 libfabric1 libfido2-1 libgcc-8-dev libgfortran-8-dev
  libgfortran-9-dev libgfortran5 libgpm2 libgssapi-krb5-2 libhwloc-dev
  libhwloc-plugins libhwloc15 libibverbs-dev libibverbs1 libicu66 libjq1
  libk5crypto3 libkeyutils1 libkrb5-3 libkrb5support0 libltdl-dev libltdl7
  libmagic-mgc libmagic1 libmpdec2 libmpx2 libnghttp2-14 libnl-3-200
  libnl-3-dev libnl-route-3-200 libnl-route-3-dev libnuma-dev libnuma1
  libonig5 libopenmpi3 libpmix2 libpsl5 libpsm-infinipath1 libpsm2-2
  libpython3-stdlib libpython3.8 libpython3.8-minimal libpython3.8-stdlib
  librdmacm1 librtmp1 libsigsegv2 libssh-4 libtdb1 libtool libvorbisfile3
  libxml2 libxmuu1 libxnvctrl0 m4 mime-support ocl-icd-libopencl1 openmpi-bin
  openmpi-common openssh-client publicsuffix python3-minimal python3.8
  python3.8-minimal sound-theme-freedesktop tzdata vim-common vim-runtime
  xauth xxd
Suggested packages:
  autoconf-archive gnu-standards autoconf-doc gettext distcc | icecc
  gcc-8-locales gcc-8-multilib gcc-8-doc gfortran-multilib gfortran-doc
  gfortran-8-multilib gfortran-8-doc gfortran-9-multilib gfortran-9-doc
  gettext-base git-daemon-run | git-daemon-sysvinit git-doc git-el git-email
  git-gui gitk gitweb git-cvs git-mediawiki git-svn libcanberra-gtk0
  libcanberra-pulse gpm krb5-doc krb5-user libhwloc-contrib-plugins
  libtool-doc openmpi-doc gcj-jdk m4-doc opencl-icd keychain libpam-ssh
  monkeysphere ssh-askpass python3-doc python3-tk python3-venv python3.8-venv
  python3.8-doc binfmt-support ctags vim-doc vim-scripts

checking prefix for lsym labels... .L
checking prefix for function in .type... @
checking if .size is needed... yes
checking if .align directive takes logarithmic value... no
checking for cmpxchg16b... yes
checking if cmpxchg16b() gives correct results... yes
checking if cmpxchg16b_result works... yes
checking for assembly architecture... X86_64
checking for builtin atomics... BUILTIN_GCC

*** Fortran compiler
checking for gfortran... no
checking for f95... no
checking for fort... no
checking for xlf95... no
checking for ifort... no
checking for ifc... no
checking for efc... no
checking for pgfortran... no
checking for pgf95... no
checking for lf95... no
checking for f90... no
checking for xlf90... no
checking for pgf90... no
checking for epcf90... no
checking for nagfor... no
checking whether we are using the GNU Fortran compiler... no
checking whether  accepts -g... no
checking whether ln -s works... yes
configure: WARNING: *** All Fortran MPI bindings disabled (could not find compiler)
checking for  warnings flags... none
checking to see if mpifort compiler needs additional linker flags... none
checking if Fortran compiler supports CHARACTER... no
checking if Fortran compiler supports LOGICAL... no
checking if Fortran compiler supports LOGICAL*1... no
checking if Fortran compiler supports LOGICAL*2... no
checking if Fortran compiler supports LOGICAL*4... no
checking if Fortran compiler supports LOGICAL*8... no
checking if Fortran compiler supports INTEGER... no
checking if Fortran compiler supports INTEGER*1... no
checking if Fortran compiler supports INTEGER*2... no
checking if Fortran compiler supports INTEGER*4... no
The following NEW packages will be installed:
  autoconf automake autotools-dev bc ccache cpp-8 file gcc-8 gcc-8-base
  gfortran gfortran-8 gfortran-9 git git-man googletest ibverbs-providers jq
  krb5-locales less libbrotli1 libcaf-openmpi-3 libcanberra0 libcbor0.6
  libcoarrays-dev libcoarrays-openmpi-dev libcurl3-gnutls liberror-perl
  libevent-2.1-7 libevent-core-2.1-7 libevent-dev libevent-extra-2.1-7
  libevent-openssl-2.1-7 libevent-pthreads-2.1-7 libfabric1 libfido2-1
  libgcc-8-dev libgfortran-8-dev libgfortran-9-dev libgfortran5 libgpm2
  libgssapi-krb5-2 libgtest-dev libhwloc-dev libhwloc-plugins libhwloc15
  libibverbs-dev libibverbs1 libicu66 libjq1 libk5crypto3 libkeyutils1
  libkrb5-3 libkrb5support0 libltdl-dev libltdl7 libmagic-mgc libmagic1
  libmpdec2 libmpx2 libnghttp2-14 libnl-3-200 libnl-3-dev libnl-route-3-200
  libnl-route-3-dev libnuma-dev libnuma1 libonig5 libopenmpi-dev libopenmpi3
  libpmix2 libpsl5 libpsm-infinipath1 libpsm2-2 libpython3-stdlib libpython3.8
  libpython3.8-minimal libpython3.8-stdlib librdmacm1 librtmp1 libsigsegv2
  libssh-4 libtdb1 libtool libvorbisfile3 libxml2 libxmuu1 libxnvctrl0 m4
  mime-support ninja-build ocl-icd-libopencl1 openmpi-bin openmpi-common
  openssh-client publicsuffix python3 python3-minimal python3.8
  python3.8-minimal sound-theme-freedesktop tzdata vim vim-common vim-runtime
  wget xauth xxd
checking if Fortran compiler supports INTEGER*8... no
checking if Fortran compiler supports INTEGER*16... no
checking if Fortran compiler supports REAL... no
checking if Fortran compiler supports REAL*2... no
checking if Fortran compiler supports REAL*4... no
checking if Fortran compiler supports REAL*8... no
checking if Fortran compiler supports REAL*16... no
checking for C type matching bit representation of REAL*16... skipped (no REAL*16)
configure: WARNING: MPI_REAL16 and MPI_COMPLEX32 support have been disabled
checking if Fortran compiler supports DOUBLE PRECISION...
0 upgraded, 107 newly installed, 0 to remove and 38 not upgraded.
Need to get 83.7 MB of archives.
After this operation, 353 MB of additional disk space will be used.
Get:1 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libpython3.8-minimal amd64 3.8.10-0ubuntu1~20.04.9 [718 kB]
Get:2 https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64  libxnvctrl0 550.54.15-0ubuntu1 [21.4 kB]
no
checking if Fortran compiler supports COMPLEX... no
checking if Fortran compiler supports COMPLEX*4... no
checking if Fortran compiler supports COMPLEX*8... no
checking if Fortran compiler supports COMPLEX*16... no
checking if Fortran compiler supports COMPLEX*32... no
checking if Fortran compiler supports DOUBLE COMPLEX... no
checking for max Fortran MPI handle index... 2147483647
checking Fortran value for .TRUE. logical type... 77
checking for correct handling of Fortran logical arrays... skipped
checking max supported Fortran array rank... 0
checking for the value of MPI_STATUS_SIZE... 6 Fortran INTEGERs
checking KIND value of Fortran C_INT16_T... skipped
checking KIND value of Fortran C_INT32_T... skipped
checking KIND value of Fortran C_INT64_T... skipped
checking if building Fortran mpif.h bindings... no
checking if building Fortran 'use mpi' bindings... no
checking if building Fortran 'use mpi_f08' bindings... no

============================================================================
== Header file tests
============================================================================
checking alloca.h usability...
Get:3 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 python3.8-minimal amd64 3.8.10-0ubuntu1~20.04.9 [1890 kB]
yes
checking alloca.h presence... yes
checking for alloca.h... yes
checking aio.h usability... yes
checking aio.h presence... yes
checking for aio.h... yes
checking arpa/inet.h usability... yes
checking arpa/inet.h presence... yes
checking for arpa/inet.h... yes
checking dirent.h usability... yes
checking dirent.h presence... yes
checking for dirent.h... yes
checking dlfcn.h usability... yes
checking dlfcn.h presence... yes
checking for dlfcn.h... yes
checking endian.h usability... yes
checking endian.h presence... yes
checking for endian.h... yes
checking execinfo.h usability... yes
checking execinfo.h presence...
Get:4 http://archive.ubuntu.com/ubuntu focal/main amd64 python3-minimal amd64 3.8.2-0ubuntu2 [23.6 kB]
Get:5 http://archive.ubuntu.com/ubuntu focal/main amd64 mime-support all 3.64ubuntu1 [30.6 kB]
Get:6 http://archive.ubuntu.com/ubuntu focal/main amd64 libmpdec2 amd64 2.4.2-3 [81.1 kB]
Get:7 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libpython3.8-stdlib amd64 3.8.10-0ubuntu1~20.04.9 [1674 kB]
Get:8 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 python3.8 amd64 3.8.10-0ubuntu1~20.04.9 [387 kB]
Get:9 http://archive.ubuntu.com/ubuntu focal/main amd64 libpython3-stdlib amd64 3.8.2-0ubuntu2 [7068 B]
Get:10 http://archive.ubuntu.com/ubuntu focal/main amd64 python3 amd64 3.8.2-0ubuntu2 [47.6 kB]
Get:11 http://archive.ubuntu.com/ubuntu focal/main amd64 libmagic-mgc amd64 1:5.38-4 [218 kB]
yes
checking for execinfo.h... yes
checking err.h usability... yes
checking err.h presence... yes
checking for err.h... yes
checking fcntl.h usability... yes
checking fcntl.h presence... yes
checking for fcntl.h... yes
checking grp.h usability... yes
checking grp.h presence... yes
checking for grp.h... yes
Get:12 http://archive.ubuntu.com/ubuntu focal/main amd64 libmagic1 amd64 1:5.38-4 [75.9 kB]
Get:13 http://archive.ubuntu.com/ubuntu focal/main amd64 file amd64 1:5.38-4 [23.3 kB]
Get:14 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 less amd64 551-1ubuntu0.2 [123 kB]
Get:15 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 tzdata all 2024a-0ubuntu0.20.04 [301 kB]
Get:16 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libicu66 amd64 66.1-2ubuntu2.1 [8515 kB]
checking libgen.h usability... yes
checking libgen.h presence... yes
checking for libgen.h... yes
checking libutil.h usability... no
checking libutil.h presence... no
checking for libutil.h... no
checking for memory.h... (cached) yes
checking for netdb.h... (cached) yes
checking for netinet/in.h... (cached) yes
checking for netinet/tcp.h... (cached) yes
checking poll.h usability... yes
checking poll.h presence... yes
checking for poll.h... yes
checking pthread.h usability... yes
checking pthread.h presence... yes
checking for pthread.h... yes
checking pty.h usability... yes
checking pty.h presence... yes
checking for pty.h... yes
checking pwd.h usability... yes
checking pwd.h presence... yes
checking for pwd.h... yes
checking sched.h usability...
Get:17 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libxml2 amd64 2.9.10+dfsg-5ubuntu0.20.04.7 [640 kB]
Get:18 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 xxd amd64 2:8.1.2269-1ubuntu5.22 [53.2 kB]
Get:19 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 vim-common all 2:8.1.2269-1ubuntu5.22 [88.2 kB]
Get:20 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 krb5-locales all 1.17-6ubuntu4.4 [11.5 kB]
Get:21 http://archive.ubuntu.com/ubuntu focal/main amd64 libcbor0.6 amd64 0.6.0-0ubuntu1 [21.1 kB]
Get:22 http://archive.ubuntu.com/ubuntu focal/main amd64 libfido2-1 amd64 1.3.1-1ubuntu2 [47.9 kB]
Get:23 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libkrb5support0 amd64 1.17-6ubuntu4.4 [31.0 kB]
Get:24 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libk5crypto3 amd64 1.17-6ubuntu4.4 [79.9 kB]
Get:25 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libkeyutils1 amd64 1.6-6ubuntu1.1 [10.3 kB]
Get:26 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libkrb5-3 amd64 1.17-6ubuntu4.4 [330 kB]
Get:27 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libgssapi-krb5-2 amd64 1.17-6ubuntu4.4 [121 kB]
Get:28 http://archive.ubuntu.com/ubuntu focal/main amd64 libnuma1 amd64 2.0.12-1 [20.8 kB]
Get:29 http://archive.ubuntu.com/ubuntu focal/main amd64 libpsl5 amd64 0.21.0-1ubuntu1 [51.5 kB]
Get:30 http://archive.ubuntu.com/ubuntu focal/main amd64 libxmuu1 amd64 2:1.1.3-0ubuntu1 [9728 B]
Get:31 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 openssh-client amd64 1:8.2p1-4ubuntu0.11 [670 kB]
Get:32 http://archive.ubuntu.com/ubuntu focal/main amd64 publicsuffix all 20200303.0012-1 [111 kB]
Get:33 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 wget amd64 1.20.3-1ubuntu2 [348 kB]
Get:34 http://archive.ubuntu.com/ubuntu focal/main amd64 xauth amd64 1:1.1-0ubuntu1 [25.0 kB]
Get:35 http://archive.ubuntu.com/ubuntu focal/main amd64 libsigsegv2 amd64 2.12-2 [13.9 kB]
Get:36 http://archive.ubuntu.com/ubuntu focal/main amd64 m4 amd64 1.4.18-4 [199 kB]
Get:37 http://archive.ubuntu.com/ubuntu focal/main amd64 autoconf all 2.69-11.1 [321 kB]
Get:38 http://archive.ubuntu.com/ubuntu focal/main amd64 autotools-dev all 20180224.1 [39.6 kB]
Get:39 http://archive.ubuntu.com/ubuntu focal/main amd64 automake all 1:1.16.1-4ubuntu6 [522 kB]
Get:40 http://archive.ubuntu.com/ubuntu focal/main amd64 bc amd64 1.07.1-2build1 [86.3 kB]
Get:41 http://archive.ubuntu.com/ubuntu focal/main amd64 ccache amd64 3.7.7-1 [121 kB]
Get:42 http://archive.ubuntu.com/ubuntu focal/universe amd64 gcc-8-base amd64 8.4.0-3ubuntu2 [18.7 kB]
Get:43 http://archive.ubuntu.com/ubuntu focal/universe amd64 cpp-8 amd64 8.4.0-3ubuntu2 [8945 kB]
yes
checking sched.h presence... yes
checking for sched.h... yes
checking for strings.h... (cached) yes
checking stropts.h usability... no
checking stropts.h presence... no
checking for stropts.h... no
checking linux/ethtool.h usability... yes
checking linux/ethtool.h presence... yes
checking for linux/ethtool.h... yes
checking linux/sockios.h usability...
Get:44 http://archive.ubuntu.com/ubuntu focal/universe amd64 libmpx2 amd64 8.4.0-3ubuntu2 [11.8 kB]
Get:45 http://archive.ubuntu.com/ubuntu focal/universe amd64 libgcc-8-dev amd64 8.4.0-3ubuntu2 [2313 kB]
yes
checking linux/sockios.h presence... yes
checking for linux/sockios.h... yes
checking sys/fcntl.h usability... yes
checking sys/fcntl.h presence... yes
checking for sys/fcntl.h... yes
checking sys/ipc.h usability... yes
checking sys/ipc.h presence... yes
checking for sys/ipc.h... yes
checking sys/shm.h usability... yes
checking sys/shm.h presence...
Get:46 http://archive.ubuntu.com/ubuntu focal/universe amd64 gcc-8 amd64 8.4.0-3ubuntu2 [9833 kB]
yes
checking for sys/shm.h... yes
checking sys/ioctl.h usability... yes
checking sys/ioctl.h presence... yes
checking for sys/ioctl.h... yes
checking sys/mman.h usability... yes
checking sys/mman.h presence... yes
checking for sys/mman.h... yes
checking sys/param.h usability... yes
checking sys/param.h presence...
Get:47 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libgfortran5 amd64 10.5.0-1ubuntu1~20.04 [737 kB]
Get:48 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libgfortran-9-dev amd64 9.4.0-1ubuntu1~20.04.2 [685 kB]
Get:49 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 gfortran-9 amd64 9.4.0-1ubuntu1~20.04.2 [7936 kB]
yes
checking for sys/param.h... yes
checking sys/queue.h usability... yes
checking sys/queue.h presence... yes
checking for sys/queue.h... yes
checking sys/resource.h usability... yes
checking sys/resource.h presence... yes
checking for sys/resource.h... yes
checking sys/select.h usability... yes
checking sys/select.h presence... yes
checking for sys/select.h... yes
checking sys/socket.h usability...
Get:50 http://archive.ubuntu.com/ubuntu focal/main amd64 gfortran amd64 4:9.3.0-1ubuntu2 [1372 B]
Get:51 http://archive.ubuntu.com/ubuntu focal/universe amd64 libgfortran-8-dev amd64 8.4.0-3ubuntu2 [625 kB]
Get:52 http://archive.ubuntu.com/ubuntu focal/universe amd64 gfortran-8 amd64 8.4.0-3ubuntu2 [9424 kB]
yes
checking sys/socket.h presence... yes
checking for sys/socket.h... yes
checking sys/sockio.h usability... no
checking sys/sockio.h presence... no
checking for sys/sockio.h... no
checking for sys/stat.h... (cached) yes
checking sys/statfs.h usability... yes
checking sys/statfs.h presence... yes
checking for sys/statfs.h... yes
checking sys/statvfs.h usability...
Get:53 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libbrotli1 amd64 1.0.7-6ubuntu0.1 [267 kB]
Get:54 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libnghttp2-14 amd64 1.40.0-1ubuntu0.2 [79.4 kB]
Get:55 http://archive.ubuntu.com/ubuntu focal/main amd64 librtmp1 amd64 2.4+20151223.gitfa8646d.1-2build1 [54.9 kB]
Get:56 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libssh-4 amd64 0.9.3-2ubuntu2.5 [171 kB]
Get:57 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libcurl3-gnutls amd64 7.68.0-1ubuntu2.21 [232 kB]
Get:58 http://archive.ubuntu.com/ubuntu focal/main amd64 liberror-perl all 0.17029-1 [26.5 kB]
Get:59 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 git-man all 1:2.25.1-1ubuntu3.11 [887 kB]
Get:60 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 git amd64 1:2.25.1-1ubuntu3.11 [4605 kB]
yes
checking sys/statvfs.h presence... yes
checking for sys/statvfs.h... yes
checking sys/time.h usability... yes
checking sys/time.h presence... yes
checking for sys/time.h... yes
checking sys/tree.h usability... no
checking sys/tree.h presence... no
checking for sys/tree.h... no
checking for sys/types.h... (cached) yes
checking sys/uio.h usability... 
Get:61 http://archive.ubuntu.com/ubuntu focal/universe amd64 googletest all 1.10.0-2 [623 kB]
Get:62 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libnl-3-200 amd64 3.4.0-1ubuntu0.1 [54.4 kB]
Get:63 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libnl-route-3-200 amd64 3.4.0-1ubuntu0.1 [151 kB]
Get:64 http://archive.ubuntu.com/ubuntu focal/main amd64 libibverbs1 amd64 28.0-1ubuntu1 [53.6 kB]
Get:65 http://archive.ubuntu.com/ubuntu focal/main amd64 ibverbs-providers amd64 28.0-1ubuntu1 [232 kB]
Get:66 http://archive.ubuntu.com/ubuntu focal/universe amd64 libonig5 amd64 6.9.4-1 [142 kB]
Get:67 http://archive.ubuntu.com/ubuntu focal-updates/universe amd64 libjq1 amd64 1.6-1ubuntu0.20.04.1 [121 kB]
Get:68 http://archive.ubuntu.com/ubuntu focal-updates/universe amd64 jq amd64 1.6-1ubuntu0.20.04.1 [50.2 kB]
Get:69 http://archive.ubuntu.com/ubuntu focal/main amd64 libevent-2.1-7 amd64 2.1.11-stable-1 [138 kB]
Get:70 http://archive.ubuntu.com/ubuntu focal/main amd64 libevent-core-2.1-7 amd64 2.1.11-stable-1 [89.1 kB]
Get:71 http://archive.ubuntu.com/ubuntu focal/main amd64 libevent-pthreads-2.1-7 amd64 2.1.11-stable-1 [7372 B]
Get:72 http://archive.ubuntu.com/ubuntu focal/universe amd64 libpsm-infinipath1 amd64 3.3+20.604758e7-6 [168 kB]
Get:73 http://archive.ubuntu.com/ubuntu focal/universe amd64 libpsm2-2 amd64 11.2.86-1 [178 kB]
Get:74 http://archive.ubuntu.com/ubuntu focal/main amd64 librdmacm1 amd64 28.0-1ubuntu1 [64.9 kB]
Get:75 http://archive.ubuntu.com/ubuntu focal-updates/universe amd64 libfabric1 amd64 1.6.2-3ubuntu0.1 [396 kB]
Get:76 http://archive.ubuntu.com/ubuntu focal/main amd64 libltdl7 amd64 2.4.6-14 [38.5 kB]
yes
checking sys/uio.h presence... yes
checking for sys/uio.h... yes
checking sys/un.h usability... yes
checking sys/un.h presence... yes
checking for sys/un.h... yes
checking net/uio.h usability... no
checking net/uio.h presence... no
checking for net/uio.h... no
checking sys/utsname.h usability... 
Get:77 http://archive.ubuntu.com/ubuntu focal/universe amd64 libhwloc15 amd64 2.1.0+dfsg-4 [134 kB]
Get:78 http://archive.ubuntu.com/ubuntu focal/main amd64 ocl-icd-libopencl1 amd64 2.2.11-1ubuntu1 [30.3 kB]
Get:79 http://archive.ubuntu.com/ubuntu focal/universe amd64 libhwloc-plugins amd64 2.1.0+dfsg-4 [14.4 kB]
Get:80 http://archive.ubuntu.com/ubuntu focal/universe amd64 libpmix2 amd64 3.1.5-1 [442 kB]
Get:81 http://archive.ubuntu.com/ubuntu focal/universe amd64 libopenmpi3 amd64 4.0.3-0ubuntu1 [1978 kB]
Get:82 http://archive.ubuntu.com/ubuntu focal/universe amd64 libcaf-openmpi-3 amd64 2.8.0-1 [35.5 kB]
Get:83 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libtdb1 amd64 1.4.5-0ubuntu0.20.04.1 [44.2 kB]
Get:84 http://archive.ubuntu.com/ubuntu focal/main amd64 libvorbisfile3 amd64 1.3.6-2ubuntu1 [16.1 kB]
Get:85 http://archive.ubuntu.com/ubuntu focal/main amd64 sound-theme-freedesktop all 0.8-2ubuntu1 [384 kB]
Get:86 http://archive.ubuntu.com/ubuntu focal/main amd64 libcanberra0 amd64 0.30-7ubuntu1 [38.1 kB]
Get:87 http://archive.ubuntu.com/ubuntu focal/universe amd64 libcoarrays-dev amd64 2.8.0-1 [28.2 kB]
Get:88 http://archive.ubuntu.com/ubuntu focal/universe amd64 openmpi-common all 4.0.3-0ubuntu1 [151 kB]
Get:89 http://archive.ubuntu.com/ubuntu focal/universe amd64 openmpi-bin amd64 4.0.3-0ubuntu1 [67.4 kB]
Get:90 http://archive.ubuntu.com/ubuntu focal/universe amd64 libcoarrays-openmpi-dev amd64 2.8.0-1 [34.2 kB]
Get:91 http://archive.ubuntu.com/ubuntu focal/main amd64 libevent-extra-2.1-7 amd64 2.1.11-stable-1 [60.0 kB]
Get:92 http://archive.ubuntu.com/ubuntu focal/main amd64 libevent-openssl-2.1-7 amd64 2.1.11-stable-1 [14.3 kB]
Get:93 http://archive.ubuntu.com/ubuntu focal/main amd64 libevent-dev amd64 2.1.11-stable-1 [261 kB]
Get:94 http://archive.ubuntu.com/ubuntu focal/main amd64 libgpm2 amd64 1.20.7-5 [15.1 kB]
Get:95 http://archive.ubuntu.com/ubuntu focal/universe amd64 libgtest-dev amd64 1.10.0-2 [1583 kB]
yes
checking sys/utsname.h presence... yes
checking for sys/utsname.h... yes
checking sys/vfs.h usability... yes
checking sys/vfs.h presence... yes
checking for sys/vfs.h... yes
checking sys/wait.h usability... yes
checking sys/wait.h presence... 
Get:96 http://archive.ubuntu.com/ubuntu focal/main amd64 libltdl-dev amd64 2.4.6-14 [162 kB]
Get:97 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libnl-3-dev amd64 3.4.0-1ubuntu0.1 [92.9 kB]
Get:98 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libnl-route-3-dev amd64 3.4.0-1ubuntu0.1 [167 kB]
Get:99 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libpython3.8 amd64 3.8.10-0ubuntu1~20.04.9 [1625 kB]
Get:100 http://archive.ubuntu.com/ubuntu focal/main amd64 libtool all 2.4.6-14 [161 kB]
Get:101 http://archive.ubuntu.com/ubuntu focal/universe amd64 ninja-build amd64 1.10.0-1build1 [107 kB]
Get:102 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 vim-runtime all 2:8.1.2269-1ubuntu5.22 [5877 kB]
yes
checking for sys/wait.h... yes
checking syslog.h usability... yes
checking syslog.h presence... yes
checking for syslog.h... yes
checking termios.h usability... yes
checking termios.h presence... yes
checking for termios.h... yes
checking ulimit.h usability... yes
checking ulimit.h presence... 
Get:103 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 vim amd64 2:8.1.2269-1ubuntu5.22 [1243 kB]
yes
checking for ulimit.h... yes
checking for unistd.h... (cached) yes
checking util.h usability... no
checking util.h presence... no
checking for util.h... no
checking utmp.h usability... yes
checking utmp.h presence... yes
checking for utmp.h... yes
checking malloc.h usability... 
Get:104 http://archive.ubuntu.com/ubuntu focal/main amd64 libnuma-dev amd64 2.0.12-1 [32.4 kB]
Get:105 http://archive.ubuntu.com/ubuntu focal/universe amd64 libhwloc-dev amd64 2.1.0+dfsg-4 [205 kB]
Get:106 http://archive.ubuntu.com/ubuntu focal/main amd64 libibverbs-dev amd64 28.0-1ubuntu1 [444 kB]
Get:107 http://archive.ubuntu.com/ubuntu focal/universe amd64 libopenmpi-dev amd64 4.0.3-0ubuntu1 [798 kB]
yes
checking malloc.h presence... yes
checking for malloc.h... yes
checking ifaddrs.h usability... yes
checking ifaddrs.h presence... yes
checking for ifaddrs.h... yes
checking crt_externs.h usability... no
checking crt_externs.h presence... no
checking for crt_externs.h... no
checking regex.h usability... yes
checking regex.h presence... yes
checking for regex.h... yes
checking mntent.h usability... yes
checking mntent.h presence... yes
checking for mntent.h... yes
checking paths.h usability... yes
checking paths.h presence... 
debconf: delaying package configuration, since apt-utils is not installed
yes
checking for paths.h... yes
checking ioLib.h usability... no
checking ioLib.h presence... no
checking for ioLib.h... no
checking sockLib.h usability... no
checking sockLib.h presence... no
checking for sockLib.h... no
checking hostLib.h usability... 
Fetched 83.7 MB in 6s (14.9 MB/s)
Selecting previously unselected package libpython3.8-minimal:amd64.
no
checking hostLib.h presence... no
checking for hostLib.h... no
checking shlwapi.h usability... no
checking shlwapi.h presence... no
checking for shlwapi.h... no
checking sys/synch.h usability... no
checking sys/synch.h presence... no
checking for sys/synch.h... no
checking db.h usability... 
(Reading database ... 14332 files and directories currently installed.)
Preparing to unpack .../libpython3.8-minimal_3.8.10-0ubuntu1~20.04.9_amd64.deb ...
Unpacking libpython3.8-minimal:amd64 (3.8.10-0ubuntu1~20.04.9) ...
no
checking db.h presence... no
checking for db.h... no
checking ndbm.h usability... no
checking ndbm.h presence... no
checking for ndbm.h... no
checking for zlib.h... (cached) no
checking ieee754.h usability... yes
checking ieee754.h presence... yes
checking for ieee754.h... yes
checking for sys/mount.h... yes
checking for sys/sysctl.h... yes
checking for net/if.h... yes

============================================================================
== Type tests
============================================================================
checking for socklen_t... yes
checking for struct sockaddr_in... (cached) yes
checking for struct sockaddr_in6... yes
checking for struct sockaddr_storage... 
Selecting previously unselected package python3.8-minimal.
Preparing to unpack .../python3.8-minimal_3.8.10-0ubuntu1~20.04.9_amd64.deb ...
Unpacking python3.8-minimal (3.8.10-0ubuntu1~20.04.9) ...
yes
checking for struct ifreq... 
Setting up libpython3.8-minimal:amd64 (3.8.10-0ubuntu1~20.04.9) ...
yes
checking for struct ethtool_cmd... yes
Setting up python3.8-minimal (3.8.10-0ubuntu1~20.04.9) ...
checking whether ethtool_cmd_speed is declared... yes
checking whether SIOCETHTOOL is declared... yes
checking for struct ethtool_cmd.speed_hi... yes
checking for struct ethtool_cmd.speed_hi... (cached) yes
checking for struct ethtool_cmd.speed_hi... (cached) yes
checking for struct ethtool_cmd.speed_hi... (cached) yes
checking whether AF_UNSPEC is declared... yes
checking whether PF_UNSPEC is declared... yes
checking whether AF_INET6 is declared... yes
checking whether PF_INET6 is declared... yes
checking if SA_RESTART defined in signal.h... yes
checking for struct sockaddr.sa_len... no
checking for struct dirent.d_type... yes
checking for siginfo_t.si_fd... yes
checking for siginfo_t.si_band... yes
checking for struct statfs.f_type... yes
checking for struct statfs.f_fstypename... no
checking for struct statvfs.f_basetype... no
checking for struct statvfs.f_fstypename... no
checking for type of MPI_Aint... ptrdiff_t (size: 8)
checking for type of MPI_Count... long long (size: 8)
checking for type of MPI_Offset... long long (size: 8)
checking for an MPI datatype for MPI_Offset... MPI_LONG_LONG
checking the linker for support for the -fini option... yes

============================================================================
== Library and Function tests
============================================================================
checking for library containing openpty... -lutil
checking for library containing gethostbyname... none required
checking for library containing socket... none required
checking for library containing sched_yield... none required
checking for library containing dirname... none required
checking for library containing ceil... -lm
checking for library containing clock_gettime... 
Selecting previously unselected package python3-minimal.
(Reading database ... 14615 files and directories currently installed.)
Preparing to unpack .../0-python3-minimal_3.8.2-0ubuntu2_amd64.deb ...
Unpacking python3-minimal (3.8.2-0ubuntu2) ...
none required
checking for asprintf... yes
checking for snprintf... yes
checking for vasprintf... 
Selecting previously unselected package mime-support.
Preparing to unpack .../1-mime-support_3.64ubuntu1_all.deb ...
Unpacking mime-support (3.64ubuntu1) ...
yes
checking for vsnprintf... yes
checking for openpty... yes
checking for isatty... 
Selecting previously unselected package libmpdec2:amd64.
Preparing to unpack .../2-libmpdec2_2.4.2-3_amd64.deb ...
Unpacking libmpdec2:amd64 (2.4.2-3) ...
Selecting previously unselected package libpython3.8-stdlib:amd64.
Preparing to unpack .../3-libpython3.8-stdlib_3.8.10-0ubuntu1~20.04.9_amd64.deb ...
Unpacking libpython3.8-stdlib:amd64 (3.8.10-0ubuntu1~20.04.9) ...
yes
checking for getpwuid... yes
checking for fork... yes
checking for waitpid... yes
checking for execve... yes
checking for pipe... yes
checking for ptsname... yes
checking for setsid... yes
checking for mmap... yes
checking for tcgetpgrp... yes
checking for posix_memalign... yes
checking for strsignal... yes
checking for sysconf... 
Selecting previously unselected package python3.8.
Preparing to unpack .../4-python3.8_3.8.10-0ubuntu1~20.04.9_amd64.deb ...
Unpacking python3.8 (3.8.10-0ubuntu1~20.04.9) ...
yes
checking for syslog... yes
checking for vsyslog... yes
checking for regcmp... no
Selecting previously unselected package libpython3-stdlib:amd64.
Preparing to unpack .../5-libpython3-stdlib_3.8.2-0ubuntu2_amd64.deb ...
Unpacking libpython3-stdlib:amd64 (3.8.2-0ubuntu2) ...
Setting up python3-minimal (3.8.2-0ubuntu2) ...
checking for regexec... yes
checking for regfree... yes
checking for _NSGetEnviron... no
checking for socketpair... yes
checking for strncpy_s... no
checking for usleep... yes
Selecting previously unselected package python3.
checking for mkfifo... yes
checking for dbopen... no
checking for dbm_open... 
(Reading database ... 15017 files and directories currently installed.)
Preparing to unpack .../00-python3_3.8.2-0ubuntu2_amd64.deb ...
Unpacking python3 (3.8.2-0ubuntu2) ...
no
checking for statfs... yes
checking for statvfs... yes
checking for setpgid... yes
checking for setenv... 
Selecting previously unselected package libmagic-mgc.
Preparing to unpack .../01-libmagic-mgc_1%3a5.38-4_amd64.deb ...
Unpacking libmagic-mgc (1:5.38-4) ...
yes
checking for __malloc_initialize_hook... no
checking for __clear_cache... yes
checking for htonl define... no
checking for htonl... 
Selecting previously unselected package libmagic1:amd64.
Preparing to unpack .../02-libmagic1_1%3a5.38-4_amd64.deb ...
Unpacking libmagic1:amd64 (1:5.38-4) ...
yes
checking whether va_copy is declared... yes
checking whether __va_copy is declared... yes
checking whether __func__ is declared... yes
Selecting previously unselected package file.
Preparing to unpack .../03-file_1%3a5.38-4_amd64.deb ...
Unpacking file (1:5.38-4) ...
Selecting previously unselected package less.
Preparing to unpack .../04-less_551-1ubuntu0.2_amd64.deb ...
Unpacking less (551-1ubuntu0.2) ...

============================================================================
== System-specific tests
============================================================================
checking for _SC_NPROCESSORS_ONLN... yes
checking whether byte ordering is bigendian... no
checking for broken qsort... no
checking if C compiler and POSIX threads work as is... 
Selecting previously unselected package tzdata.
Preparing to unpack .../05-tzdata_2024a-0ubuntu0.20.04_all.deb ...
Unpacking tzdata (2024a-0ubuntu0.20.04) ...
no
checking if C++ compiler and POSIX threads work as is... no
checking if C compiler and POSIX threads work with -Kthread... no
checking if C compiler and POSIX threads work with -kthread... no
checking if C compiler and POSIX threads work with -pthread... yes
checking if C++ compiler and POSIX threads work with -Kthread... no
checking if C++ compiler and POSIX threads work with -kthread... no
checking if C++ compiler and POSIX threads work with -pthread... yes
checking for pthread_mutexattr_setpshared... yes
checking for pthread_condattr_setpshared... yes
checking for PTHREAD_MUTEX_ERRORCHECK_NP... yes
checking for PTHREAD_MUTEX_ERRORCHECK... yes
checking for working POSIX threads package... yes
checking if threads have different pids (pthreads on linux)... no
checking whether ln -s works... yes
checking for grep that handles long lines and -e... (cached) /usr/bin/grep
checking for egrep... (cached) /usr/bin/grep -E
checking dependency style of gcc... (cached) gcc3
checking for flex... no
checking for lex... no
checking for flavor of ps to use... ps -A -o fname,pid,uid
checking if build filesystem is case sensitive... yes
checking if configuring for case sensitive filesystem... yes
checking whether RLIMIT_NPROC is declared... yes
checking whether RLIMIT_MEMLOCK is declared... yes
checking whether RLIMIT_NOFILE is declared... yes
checking whether RLIMIT_FSIZE is declared... yes
checking whether RLIMIT_CORE is declared... yes
checking whether RLIMIT_STACK is declared... yes
checking whether RLIMIT_AS is declared... yes

============================================================================
== Modular Component Architecture (MCA) setup
============================================================================
checking for subdir args...  '--prefix=/opt/openmpi' '--with-cuda' 'CFLAGS=-w'
checking for pkg-config... no
checking --with-verbs value... simple ok (unspecified value)
checking --with-verbs-libdir value... simple ok (unspecified value)
checking for pkg-config... no
checking for X... no
 
==> Pre-emptively configuring the hwloc framework to satisfy dependencies.
checking whether to enable hwloc PCI device support... yes (default)

+++ Configuring MCA framework hwloc
checking for no configure components in framework hwloc... 
checking for m4 configure components in framework hwloc... external, hwloc201

--- MCA component hwloc:external (m4 configuration macro, priority 90)
checking for MCA component hwloc:external compile mode... static
checking --with-hwloc-libdir value... simple ok (unspecified value)
checking looking for external hwloc in... ()
checking hwloc.h usability... no
checking hwloc.h presence... no
checking for hwloc.h... no
checking if MCA component hwloc:external can compile... no

--- MCA component hwloc:hwloc201 (m4 configuration macro, priority 80)
checking for MCA component hwloc:hwloc201 compile mode... static
checking if hwloc external component succeeded... no
configure: hwloc:external failed, so this component will be used
checking hwloc building mode... embedded
configure: hwloc builddir: /scratch/build/opal/mca/hwloc/hwloc201/hwloc
configure: hwloc srcdir: /scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc
configure: Detected VPATH build
checking for hwloc version... 2.0.2rc1-git
checking if want hwloc maintainer support... disabled (embedded mode)
checking for hwloc directory prefix... opal/mca/hwloc/hwloc201/hwloc/
checking for hwloc symbol prefix... opal_hwloc201_
checking for gcc option to accept ISO C99... none needed
checking size of void *... (cached) 8
checking which OS support to include... Linux
checking which CPU support to include... x86_64
checking size of unsigned long... 8
checking size of unsigned int... 4
checking for the C compiler vendor... gnu
checking for __attribute__... yes
checking for __attribute__(aligned)... yes
checking for __attribute__(always_inline)... yes
checking for __attribute__(cold)... yes
checking for __attribute__(const)... yes
checking for __attribute__(deprecated)... yes
checking for __attribute__(format)... no
checking for __attribute__(hot)... yes
checking for __attribute__(malloc)... yes
checking for __attribute__(may_alias)... yes
checking for __attribute__(no_instrument_function)... yes
checking for __attribute__(nonnull)... no
checking for __attribute__(noreturn)... yes
checking for __attribute__(packed)... yes
checking for __attribute__(pure)... yes
checking for __attribute__(sentinel)... no
checking for __attribute__(unused)... yes
checking for __attribute__(warn_unused_result)... no
checking for __attribute__(weak_alias)... yes
checking if gcc supports -fvisibility=hidden... yes
checking whether to enable symbol visibility... yes (via -fvisibility=hidden)
configure: WARNING: "-fvisibility=hidden" has been added to the hwloc CFLAGS
checking whether the C compiler rejects function calls with too many arguments... yes
checking whether the C compiler rejects function calls with too few arguments... yes
checking for unistd.h... (cached) yes
checking for dirent.h... (cached) yes
checking for strings.h... (cached) yes
checking ctype.h usability... yes
checking ctype.h presence... yes
checking for ctype.h... yes
checking for strncasecmp... yes
checking whether strncasecmp is declared... yes
checking whether function strncasecmp has a complete prototype... yes
checking for strftime... yes
checking for setlocale... yes
checking for stdint.h... (cached) yes
checking for sys/mman.h... (cached) yes
checking for KAFFINITY... no
checking for PROCESSOR_CACHE_TYPE... no
checking for CACHE_DESCRIPTOR... no
checking for LOGICAL_PROCESSOR_RELATIONSHIP... no
checking for RelationProcessorPackage... no
checking for SYSTEM_LOGICAL_PROCESSOR_INFORMATION... no
checking for GROUP_AFFINITY... no
checking for PROCESSOR_RELATIONSHIP... no
checking for NUMA_NODE_RELATIONSHIP... no
checking for CACHE_RELATIONSHIP... no
checking for PROCESSOR_GROUP_INFO... no
checking for GROUP_RELATIONSHIP... no
checking for SYSTEM_LOGICAL_PROCESSOR_INFORMATION_EX... no
checking for PSAPI_WORKING_SET_EX_BLOCK... no
checking for PSAPI_WORKING_SET_EX_INFORMATION... no
checking for PROCESSOR_NUMBER... no
checking for main in -lgdi32... no
checking for PostQuitMessage in -luser32... 
Selecting previously unselected package libicu66:amd64.
Preparing to unpack .../06-libicu66_66.1-2ubuntu2.1_amd64.deb ...
Unpacking libicu66:amd64 (66.1-2ubuntu2.1) ...
no
checking windows.h usability... no
checking windows.h presence... no
checking for windows.h... no
checking sys/lgrp_user.h usability... no
checking sys/lgrp_user.h presence... no
checking for sys/lgrp_user.h... no
checking kstat.h usability... no
checking kstat.h presence... no
checking for kstat.h... no
checking whether fabsf is declared... yes
checking for fabsf in -lm... yes
checking picl.h usability... no
checking picl.h presence... no
checking for picl.h... no
checking whether _SC_NPROCESSORS_ONLN is declared... yes
checking whether _SC_NPROCESSORS_CONF is declared... yes
checking whether _SC_NPROC_ONLN is declared... no
checking whether _SC_NPROC_CONF is declared... no
checking whether _SC_PAGESIZE is declared... yes
checking whether _SC_PAGE_SIZE is declared... yes
checking whether _SC_LARGE_PAGESIZE is declared... no
checking mach/mach_host.h usability... no
checking mach/mach_host.h presence... no
checking for mach/mach_host.h... no
checking mach/mach_init.h usability... no
checking mach/mach_init.h presence... no
checking for mach/mach_init.h... no
checking for sys/param.h... (cached) yes
checking for sys/sysctl.h... (cached) yes
checking whether CTL_HW is declared... no
checking whether HW_NCPU is declared... no
checking whether strtoull is declared... yes
checking for ssize_t... (cached) yes
checking whether snprintf is declared... yes
checking whether strcasecmp is declared... yes
checking whether _strdup is declared... no
checking whether _putenv is declared... no
checking whether getprogname is declared... no
checking whether getexecname is declared... no
checking whether GetModuleFileName is declared... no
checking for program_invocation_name... yes
checking for __progname... yes
checking for pthread_t... 
Selecting previously unselected package libxml2:amd64.
Preparing to unpack .../07-libxml2_2.9.10+dfsg-5ubuntu0.20.04.7_amd64.deb ...
Unpacking libxml2:amd64 (2.9.10+dfsg-5ubuntu0.20.04.7) ...
Selecting previously unselected package xxd.
Preparing to unpack .../08-xxd_2%3a8.1.2269-1ubuntu5.22_amd64.deb ...
Unpacking xxd (2:8.1.2269-1ubuntu5.22) ...
yes
checking whether sched_getcpu is declared... yes
checking whether sched_setaffinity is declared... yes
checking whether function sched_setaffinity has a complete prototype... yes
checking for old prototype of sched_setaffinity... no
checking for working CPU_SET... 
Selecting previously unselected package vim-common.
Preparing to unpack .../09-vim-common_2%3a8.1.2269-1ubuntu5.22_all.deb ...
Unpacking vim-common (2:8.1.2269-1ubuntu5.22) ...
yes
checking for working CPU_SET_S... yes
checking for working syscall with 6 parameters... yes
checking for lib... no
checking for bash... /bin/bash
checking for ffs... yes
checking whether ffs is declared... yes
checking whether function ffs has a complete prototype... yes
checking for ffsl... 
Selecting previously unselected package krb5-locales.
Preparing to unpack .../10-krb5-locales_1.17-6ubuntu4.4_all.deb ...
Unpacking krb5-locales (1.17-6ubuntu4.4) ...
Selecting previously unselected package libcbor0.6:amd64.
Preparing to unpack .../11-libcbor0.6_0.6.0-0ubuntu1_amd64.deb ...
Unpacking libcbor0.6:amd64 (0.6.0-0ubuntu1) ...
yes
checking whether ffsl is declared... yes
checking whether function ffsl has a complete prototype... yes
checking for fls... no
checking for flsl... 
Selecting previously unselected package libfido2-1:amd64.
Preparing to unpack .../12-libfido2-1_1.3.1-1ubuntu2_amd64.deb ...
Unpacking libfido2-1:amd64 (1.3.1-1ubuntu2) ...
no
checking for clz... no
checking for clzl... no
checking for openat... 
Selecting previously unselected package libkrb5support0:amd64.
Preparing to unpack .../13-libkrb5support0_1.17-6ubuntu4.4_amd64.deb ...
Unpacking libkrb5support0:amd64 (1.17-6ubuntu4.4) ...
yes
checking for malloc.h... (cached) yes
checking for getpagesize... yes
checking for memalign... yes
Selecting previously unselected package libk5crypto3:amd64.
Preparing to unpack .../14-libk5crypto3_1.17-6ubuntu4.4_amd64.deb ...
Unpacking libk5crypto3:amd64 (1.17-6ubuntu4.4) ...
Selecting previously unselected package libkeyutils1:amd64.
Preparing to unpack .../15-libkeyutils1_1.6-6ubuntu1.1_amd64.deb ...
Unpacking libkeyutils1:amd64 (1.6-6ubuntu1.1) ...
checking for posix_memalign... (cached) yes
checking for sys/utsname.h... (cached) yes
checking for uname... yes
checking pthread_np.h usability... no
checking pthread_np.h presence... no
checking for pthread_np.h... no
checking whether pthread_setaffinity_np is declared... 
Selecting previously unselected package libkrb5-3:amd64.
Preparing to unpack .../16-libkrb5-3_1.17-6ubuntu4.4_amd64.deb ...
yes
checking whether pthread_getaffinity_np is declared... yes
checking for sched_setaffinity... yes
checking for sys/cpuset.h... 
Unpacking libkrb5-3:amd64 (1.17-6ubuntu4.4) ...
no
checking for cpuset_setaffinity... no
checking for library containing pthread_getthrds_np... 
Selecting previously unselected package libgssapi-krb5-2:amd64.
Preparing to unpack .../17-libgssapi-krb5-2_1.17-6ubuntu4.4_amd64.deb ...
Unpacking libgssapi-krb5-2:amd64 (1.17-6ubuntu4.4) ...
no
checking for cpuset_setid... no
checking libudev.h usability... no
checking libudev.h presence... 
Selecting previously unselected package libnuma1:amd64.
Preparing to unpack .../18-libnuma1_2.0.12-1_amd64.deb ...
Unpacking libnuma1:amd64 (2.0.12-1) ...
no
checking for libudev.h... no
checking for PCIACCESS... cannot check without pkg-config
checking pciaccess.h usability... no
checking pciaccess.h presence... 
Selecting previously unselected package libpsl5:amd64.
Preparing to unpack .../19-libpsl5_0.21.0-1ubuntu1_amd64.deb ...
Unpacking libpsl5:amd64 (0.21.0-1ubuntu1) ...
no
checking for pciaccess.h... no
checking X11/Xlib.h usability... no
checking X11/Xlib.h presence... no
checking for X11/Xlib.h... no
checking for x86 cpuid... 
Selecting previously unselected package libxmuu1:amd64.
Preparing to unpack .../20-libxmuu1_2%3a1.1.3-0ubuntu1_amd64.deb ...
Unpacking libxmuu1:amd64 (2:1.1.3-0ubuntu1) ...
yes
checking for pthread_mutex_lock... yes
checking if plugin support is enabled... no
checking components to build statically...  noos xml synthetic xml_nolibxml linux linuxio x86
checking components to build as plugins... 
checking whether hwloc configure succeeded... yes
checking infiniband/verbs.h usability... no
checking infiniband/verbs.h presence... no
checking for infiniband/verbs.h... no
checking if MCA component hwloc:hwloc201 can compile... yes
Selecting previously unselected package openssh-client.
Preparing to unpack .../21-openssh-client_1%3a8.2p1-4ubuntu0.11_amd64.deb ...
Unpacking openssh-client (1:8.2p1-4ubuntu0.11) ...
 
checking for winning hwloc component header file... opal/mca/hwloc/hwloc201/hwloc201.h
 
<== Pre-emptive hwloc framework configuration complete.
<== We now return you to your regularly scheduled programming.
 
checking which components should be disabled... 
checking which components should be direct-linked into the library... 
checking which components should be run-time loadable... all
checking which components should be static... none
checking for projects containing MCA frameworks... opal, orte, ompi, oshmem

*** Configuring MCA for opal
checking for frameworks for opal... common, allocator, backtrace, btl, compress, crs, dl, event, hwloc, if, installdirs, memchecker, memcpy, memory, mpool, patcher, pmix, pstat, rcache, reachable, shmem, timer

+++ Configuring MCA framework common
checking for no configure components in framework common... 
checking for m4 configure components in framework common... cuda, sm, ucx, verbs, verbs_usnic

--- MCA component common:cuda (m4 configuration macro)
checking for MCA component common:cuda compile mode... dso
checking if MCA component common:cuda can compile... yes

--- MCA component common:sm (m4 configuration macro)
checking for MCA component common:sm compile mode... dso
checking if MCA component common:sm can compile... yes

--- MCA component common:ucx (m4 configuration macro)
checking for MCA component common:ucx compile mode... dso
checking --with-ucx value... simple ok (unspecified value)
checking --with-ucx-libdir value... simple ok (unspecified value)
checking for ucx... no
checking ucp/api/ucp.h usability... no
checking ucp/api/ucp.h presence... no
checking for ucp/api/ucp.h... no
checking ucp/api/ucp.h usability... no
checking ucp/api/ucp.h presence... 
Selecting previously unselected package publicsuffix.
Preparing to unpack .../22-publicsuffix_20200303.0012-1_all.deb ...
Unpacking publicsuffix (20200303.0012-1) ...
no
checking for ucp/api/ucp.h... no
checking whether ucp_tag_send_nbr is declared... no
checking whether ucp_ep_flush_nb is declared... 
Selecting previously unselected package wget.
Preparing to unpack .../23-wget_1.20.3-1ubuntu2_amd64.deb ...
Unpacking wget (1.20.3-1ubuntu2) ...
no
checking whether ucp_worker_flush_nb is declared... no
checking whether ucp_request_check_status is declared... no
checking whether ucp_put_nb is declared... no
checking whether ucp_get_nb is declared... no
checking whether ucm_test_events is declared... no
checking whether UCP_ATOMIC_POST_OP_AND is declared... no
checking whether UCP_ATOMIC_POST_OP_OR is declared... 
Selecting previously unselected package xauth.
Preparing to unpack .../24-xauth_1%3a1.1-0ubuntu1_amd64.deb ...
Unpacking xauth (1:1.1-0ubuntu1) ...
no
checking whether UCP_ATOMIC_POST_OP_XOR is declared... no
checking whether UCP_ATOMIC_FETCH_OP_FAND is declared... no
checking whether UCP_ATOMIC_FETCH_OP_FOR is declared... no
checking whether UCP_ATOMIC_FETCH_OP_FXOR is declared... no
checking whether UCP_PARAM_FIELD_ESTIMATED_NUM_PPN is declared... no
checking whether UCP_WORKER_ATTR_FIELD_ADDRESS_FLAGS is declared... 
Selecting previously unselected package libsigsegv2:amd64.
Preparing to unpack .../25-libsigsegv2_2.12-2_amd64.deb ...
Unpacking libsigsegv2:amd64 (2.12-2) ...
no
checking if MCA component common:ucx can compile... no

--- MCA component common:verbs (m4 configuration macro)
checking for MCA component common:verbs compile mode... dso
checking if want to add padding to the openib BTL control header... no
checking for fcntl.h... (cached) yes
checking sys/poll.h usability... yes
checking sys/poll.h presence... yes
checking for sys/poll.h... yes
checking infiniband/verbs.h usability... no
checking infiniband/verbs.h presence... no
checking for infiniband/verbs.h... no
checking if ConnectX XRC support is enabled... no
checking if ConnectIB XRC support is enabled... no
checking if dynamic SL is enabled... no
Selecting previously unselected package m4.
Preparing to unpack .../26-m4_1.4.18-4_amd64.deb ...
Unpacking m4 (1.4.18-4) ...
checking if MCA component common:verbs can compile... no

--- MCA component common:verbs_usnic (m4 configuration macro)
checking for MCA component common:verbs_usnic compile mode... static
checking if MCA component common:verbs_usnic can compile... no

+++ Configuring MCA framework allocator
checking for no configure components in framework allocator... basic, bucket
checking for m4 configure components in framework allocator... 

--- MCA component allocator:basic (no configuration)
checking for MCA component allocator:basic compile mode... dso
checking if MCA component allocator:basic can compile... yes

--- MCA component allocator:bucket (no configuration)
checking for MCA component allocator:bucket compile mode... dso
checking if MCA component allocator:bucket can compile... yes

+++ Configuring MCA framework backtrace
checking for no configure components in framework backtrace... 
checking for m4 configure components in framework backtrace... execinfo, none, printstack

--- MCA component backtrace:execinfo (m4 configuration macro, priority 30)
checking for MCA component backtrace:execinfo compile mode... static
checking for execinfo.h... (cached) yes
checking for library containing backtrace... 
Selecting previously unselected package autoconf.
Preparing to unpack .../27-autoconf_2.69-11.1_all.deb ...
Unpacking autoconf (2.69-11.1) ...
none required
checking if MCA component backtrace:execinfo can compile... yes

--- MCA component backtrace:printstack (m4 configuration macro, priority 30)
checking for MCA component backtrace:printstack compile mode... static
checking ucontext.h usability... yes
checking ucontext.h presence... yes
checking for ucontext.h... yes
checking for printstack... no
checking if MCA component backtrace:printstack can compile... no

--- MCA component backtrace:none (m4 configuration macro, priority 0)
checking for MCA component backtrace:none compile mode... static
checking if MCA component backtrace:none can compile... no

+++ Configuring MCA framework btl
checking for no configure components in framework btl... self
checking for m4 configure components in framework btl... openib, portals4, sm, smcuda, tcp, uct, ugni, usnic, vader

--- MCA component btl:self (no configuration)
checking for MCA component btl:self compile mode... dso
checking if MCA component btl:self can compile... yes

--- MCA component btl:openib (m4 configuration macro)
checking for MCA component btl:openib compile mode... dso
checking whether expanded verbs are available... no
checking whether IBV_EXP_ATOMIC_HCA_REPLY_BE is declared... 
Selecting previously unselected package autotools-dev.
Preparing to unpack .../28-autotools-dev_20180224.1_all.deb ...
Unpacking autotools-dev (20180224.1) ...
no
checking whether IBV_EXP_QP_CREATE_ATOMIC_BE_REPLY is declared... no
checking whether ibv_exp_create_qp is declared... no
checking whether ibv_exp_query_device is declared... no
checking whether IBV_EXP_QP_INIT_ATTR_ATOMICS_ARG is declared... no
checking for struct ibv_exp_device_attr.ext_atom... no
checking for struct ibv_exp_device_attr.exp_atomic_cap... 
Selecting previously unselected package automake.
Preparing to unpack .../29-automake_1%3a1.16.1-4ubuntu6_all.deb ...
Unpacking automake (1:1.16.1-4ubuntu6) ...
no
checking if MCA component btl:openib can compile... no

--- MCA component btl:portals4 (m4 configuration macro)
checking for MCA component btl:portals4 compile mode... dso
checking --with-portals4 value... simple ok (unspecified value)
checking --with-portals4-libdir value... simple ok (unspecified value)
checking portals4.h usability... no
checking portals4.h presence... no
checking for portals4.h... no
checking whether to enable flow control... no
checking if MCA component btl:portals4 can compile... no

--- MCA component btl:sm (m4 configuration macro)
checking for MCA component btl:sm compile mode... dso
checking if MCA component btl:sm can compile... yes

--- MCA component btl:smcuda (m4 configuration macro)
checking for MCA component btl:smcuda compile mode... dso
checking if MCA component btl:smcuda can compile... yes

--- MCA component btl:tcp (m4 configuration macro)
checking for MCA component btl:tcp compile mode... dso
checking for struct sockaddr_in... (cached) yes
checking if MCA component btl:tcp can compile... yes

--- MCA component btl:uct (m4 configuration macro)
checking for MCA component btl:uct compile mode... dso
checking if MCA component btl:uct can compile... no

--- MCA component btl:ugni (m4 configuration macro)
checking for MCA component btl:ugni compile mode... dso
checking for CRAY_UGNI... no
checking if MCA component btl:ugni can compile... no

--- MCA component btl:usnic (m4 configuration macro)
checking for MCA component btl:usnic compile mode... dso
checking size of void *... (cached) 8
checking for 64 bit Linux... yes
checking --with-ofi value... simple ok (unspecified value)
checking --with-ofi-libdir value... simple ok (unspecified value)
checking looking for OFI libfabric in... ()
checking rdma/fabric.h usability... no
checking rdma/fabric.h presence... no
checking for rdma/fabric.h... no
checking if MCA component btl:usnic can compile... no

--- MCA component btl:vader (m4 configuration macro)
checking for MCA component btl:vader compile mode... dso
checking for Cray XPMEM support... checking for CRAY_XPMEM... no
checking --with-xpmem value... simple ok (unspecified value)
checking --with-xpmem-libdir value... simple ok (unspecified value)
checking xpmem.h usability... no
checking xpmem.h presence... no
checking for xpmem.h... no
Selecting previously unselected package bc.
Preparing to unpack .../30-bc_1.07.1-2build1_amd64.deb ...
Unpacking bc (1.07.1-2build1) ...
Selecting previously unselected package ccache.
Preparing to unpack .../31-ccache_3.7.7-1_amd64.deb ...
Unpacking ccache (3.7.7-1) ...
checking --with-knem value... simple ok (unspecified value)
checking knem_io.h usability... no
checking knem_io.h presence... no
checking for knem_io.h... no
checking sys/prctl.h usability... yes
checking sys/prctl.h presence... yes
checking for sys/prctl.h... yes
checking for process_vm_readv... yes
Selecting previously unselected package gcc-8-base:amd64.
Preparing to unpack .../32-gcc-8-base_8.4.0-3ubuntu2_amd64.deb ...
Unpacking gcc-8-base:amd64 (8.4.0-3ubuntu2) ...
checking for sys/prctl.h... (cached) yes
checking if MCA component btl:vader can compile... yes

+++ Configuring MCA framework compress
checking for no configure components in framework compress... bzip, gzip
checking for m4 configure components in framework compress... 

--- MCA component compress:bzip (no configuration)
checking for MCA component compress:bzip compile mode... dso
checking if MCA component compress:bzip can compile... yes

--- MCA component compress:gzip (no configuration)
checking for MCA component compress:gzip compile mode... dso
checking if MCA component compress:gzip can compile... yes

+++ Configuring MCA framework crs
checking for no configure components in framework crs... none
checking for m4 configure components in framework crs... self

--- MCA component crs:none (no configuration)
checking for MCA component crs:none compile mode... dso
checking if MCA component crs:none can compile... yes

--- MCA component crs:self (m4 configuration macro)
checking for MCA component crs:self compile mode... dso
checking if MCA component crs:self can compile... no

+++ Configuring MCA framework dl
checking for no configure components in framework dl... 
checking for m4 configure components in framework dl... dlopen, libltdl

--- MCA component dl:dlopen (m4 configuration macro, priority 80)
checking for MCA component dl:dlopen compile mode... static
checking dlfcn.h usability... yes
checking dlfcn.h presence... 
Selecting previously unselected package cpp-8.
Preparing to unpack .../33-cpp-8_8.4.0-3ubuntu2_amd64.deb ...
Unpacking cpp-8 (8.4.0-3ubuntu2) ...
yes
checking for dlfcn.h... yes
looking for library without search path
checking for library containing dlopen... -ldl
checking if libdl requires libnl v1 or v3... 
checking if MCA component dl:dlopen can compile... yes

--- MCA component dl:libltdl (m4 configuration macro, priority 50)
checking for MCA component dl:libltdl compile mode... static
checking --with-libltdl value... simple ok (unspecified value)
checking --with-libltdl-libdir value... simple ok (unspecified value)
checking for libltdl dir... compiler default
checking for libltdl library dir... linker default
checking ltdl.h usability... no
checking ltdl.h presence... no
checking for ltdl.h... no
checking if MCA component dl:libltdl can compile... no

+++ Configuring MCA framework event
checking for no configure components in framework event... 
checking for m4 configure components in framework event... external, libevent2022

--- MCA component event:external (m4 configuration macro, priority 90)
checking for MCA component event:external compile mode... static
checking --with-libevent-libdir value... simple ok (unspecified value)
checking for external libevent in... (default search paths)
checking event2/event.h usability... no
checking event2/event.h presence... no
checking for event2/event.h... no
checking if MCA component event:external can compile... no

--- MCA component event:libevent2022 (m4 configuration macro, priority 80)
checking for MCA component event:libevent2022 compile mode... static
checking if event external component succeeded... no
configure: event:external failed, so this component will be used
checking libevent configuration args... --disable-dns --disable-http --disable-rpc --disable-openssl --enable-thread-support --disable-evport
configure: OPAL configuring in opal/mca/event/libevent2022/libevent
configure: running /bin/bash '../../../../../../openmpi/opal/mca/event/libevent2022/libevent/configure' --disable-dns --disable-http --disable-rpc --disable-openssl --enable-thread-support --disable-evport  '--prefix=/opt/openmpi' '--with-cuda' 'CFLAGS=-w' 'CPPFLAGS=-I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include   -Drandom=opal_random' --cache-file=/dev/null --srcdir=../../../../../../openmpi/opal/mca/event/libevent2022/libevent --disable-option-checking
checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
checking for a thread-safe mkdir -p... /usr/bin/mkdir -p
checking for gawk... no
checking for mawk... mawk
checking whether make sets $(MAKE)... yes
checking whether make supports nested variables... yes
checking build system type... x86_64-unknown-linux-gnu
checking host system type... x86_64-unknown-linux-gnu
checking for gcc... gcc
checking whether the C compiler works... yes
checking for C compiler default output file name... a.out
checking for suffix of executables... 
checking whether we are cross compiling... no
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether gcc accepts -g... 
Selecting previously unselected package libmpx2:amd64.
Preparing to unpack .../34-libmpx2_8.4.0-3ubuntu2_amd64.deb ...
Unpacking libmpx2:amd64 (8.4.0-3ubuntu2) ...
Selecting previously unselected package libgcc-8-dev:amd64.
Preparing to unpack .../35-libgcc-8-dev_8.4.0-3ubuntu2_amd64.deb ...
Unpacking libgcc-8-dev:amd64 (8.4.0-3ubuntu2) ...
yes
checking for gcc option to accept ISO C89... none needed
checking whether gcc understands -c and -o together... yes
checking for style of include used by make... GNU
checking dependency style of gcc... gcc3
checking for a sed that does not truncate output... /usr/bin/sed
checking whether ln -s works... yes
checking how to run the C preprocessor... gcc -E
checking for grep that handles long lines and -e... /usr/bin/grep
checking for egrep... /usr/bin/grep -E
checking whether gcc needs -traditional... no
checking how to print strings... printf
checking for a sed that does not truncate output... (cached) /usr/bin/sed
checking for fgrep... /usr/bin/grep -F
checking for ld used by gcc... /usr/bin/ld
checking if the linker (/usr/bin/ld) is GNU ld... yes
checking for BSD- or MS-compatible name lister (nm)... /usr/bin/nm -B
checking the name lister (/usr/bin/nm -B) interface... BSD nm
checking the maximum length of command line arguments... 1572864
checking how to convert x86_64-unknown-linux-gnu file names to x86_64-unknown-linux-gnu format... func_convert_file_noop
checking how to convert x86_64-unknown-linux-gnu file names to toolchain format... func_convert_file_noop
checking for /usr/bin/ld option to reload object files... -r
checking for objdump... objdump
checking how to recognize dependent libraries... pass_all
checking for dlltool... no
checking how to associate runtime and link libraries... printf %s\n
checking for ar... ar
checking for archiver @FILE support... @
checking for strip... strip
checking for ranlib... ranlib
checking command to parse /usr/bin/nm -B output from gcc object... ok
checking for sysroot... no
checking for a working dd... /usr/bin/dd
checking how to truncate binary pipes... /usr/bin/dd bs=4096 count=1
../../../../../../openmpi/opal/mca/event/libevent2022/libevent/configure: line 7292: /usr/bin/file: No such file or directory
checking for mt... no
checking if : is a manifest tool... no
checking for ANSI C header files... yes
checking for sys/types.h... yes
checking for sys/stat.h... yes
checking for stdlib.h... yes
checking for string.h... yes
checking for memory.h... yes
checking for strings.h... 
Selecting previously unselected package gcc-8.
Preparing to unpack .../36-gcc-8_8.4.0-3ubuntu2_amd64.deb ...
Unpacking gcc-8 (8.4.0-3ubuntu2) ...
yes
checking for inttypes.h... yes
checking for stdint.h... yes
checking for unistd.h... yes
checking for dlfcn.h... yes
checking for objdir... .libs
checking if gcc supports -fno-rtti -fno-exceptions... yes
checking for gcc option to produce PIC... -fPIC -DPIC
checking if gcc PIC flag -fPIC -DPIC works... yes
checking if gcc static flag -static works... yes
checking if gcc supports -c -o file.o... yes
checking if gcc supports -c -o file.o... (cached) yes
checking whether the gcc linker (/usr/bin/ld) supports shared libraries... yes
checking whether -lc should be explicitly linked in... no
checking dynamic linker characteristics... GNU/Linux ld.so
checking how to hardcode library paths into programs... immediate
checking whether stripping libraries is possible... yes
checking if libtool supports shared libraries... yes
checking whether to build shared libraries... yes
checking whether to build static libraries... yes
checking for library containing inet_ntoa... none required
checking for library containing socket... none required
checking for library containing inet_aton... none required
checking for library containing clock_gettime... none required
checking for library containing sendfile... none required
checking for WIN32... no
checking for CYGWIN... no
checking zlib.h usability... no
checking zlib.h presence... no
checking for zlib.h... no
checking openssl/bio.h usability... no
checking openssl/bio.h presence... no
checking for openssl/bio.h... no
checking for ANSI C header files... (cached) yes
checking fcntl.h usability... yes
checking fcntl.h presence... yes
checking for fcntl.h... yes
checking stdarg.h usability... yes
checking stdarg.h presence... yes
checking for stdarg.h... yes
checking for inttypes.h... (cached) yes
checking for stdint.h... (cached) yes
checking stddef.h usability... yes
checking stddef.h presence... yes
checking for stddef.h... yes
checking poll.h usability... yes
checking poll.h presence... yes
checking for poll.h... yes
checking for unistd.h... (cached) yes
checking sys/epoll.h usability... yes
checking sys/epoll.h presence... yes
checking for sys/epoll.h... yes
checking sys/time.h usability... yes
checking sys/time.h presence... yes
checking for sys/time.h... yes
checking sys/queue.h usability... yes
checking sys/queue.h presence... 
Selecting previously unselected package libgfortran5:amd64.
Preparing to unpack .../37-libgfortran5_10.5.0-1ubuntu1~20.04_amd64.deb ...
Unpacking libgfortran5:amd64 (10.5.0-1ubuntu1~20.04) ...
Selecting previously unselected package libgfortran-9-dev:amd64.
Preparing to unpack .../38-libgfortran-9-dev_9.4.0-1ubuntu1~20.04.2_amd64.deb ...
Unpacking libgfortran-9-dev:amd64 (9.4.0-1ubuntu1~20.04.2) ...
yes
checking for sys/queue.h... yes
checking sys/event.h usability... no
checking sys/event.h presence... no
checking for sys/event.h... no
checking sys/param.h usability... yes
checking sys/param.h presence... yes
checking for sys/param.h... yes
checking sys/ioctl.h usability... yes
checking sys/ioctl.h presence... yes
checking for sys/ioctl.h... yes
Selecting previously unselected package gfortran-9.
Preparing to unpack .../39-gfortran-9_9.4.0-1ubuntu1~20.04.2_amd64.deb ...
Unpacking gfortran-9 (9.4.0-1ubuntu1~20.04.2) ...
checking sys/select.h usability... yes
checking sys/select.h presence... yes
checking for sys/select.h... yes
checking sys/devpoll.h usability... no
checking sys/devpoll.h presence... no
checking for sys/devpoll.h... no
checking port.h usability... no
checking port.h presence... no
checking for port.h... no
checking netinet/in.h usability... yes
checking netinet/in.h presence... yes
checking for netinet/in.h... yes
checking netinet/in6.h usability... no
checking netinet/in6.h presence... no
checking for netinet/in6.h... no
checking sys/socket.h usability... yes
checking sys/socket.h presence... yes
checking for sys/socket.h... yes
checking sys/uio.h usability... yes
checking sys/uio.h presence... yes
checking for sys/uio.h... yes
checking arpa/inet.h usability... yes
checking arpa/inet.h presence... yes
checking for arpa/inet.h... yes
checking sys/eventfd.h usability... yes
checking sys/eventfd.h presence... yes
checking for sys/eventfd.h... yes
checking sys/mman.h usability... yes
checking sys/mman.h presence... yes
checking for sys/mman.h... yes
checking sys/sendfile.h usability... yes
checking sys/sendfile.h presence... yes
checking for sys/sendfile.h... yes
checking sys/wait.h usability... yes
checking sys/wait.h presence... yes
checking for sys/wait.h... yes
checking netdb.h usability... yes
checking netdb.h presence... yes
checking for netdb.h... yes
checking for sys/stat.h... (cached) yes
checking for sys/sysctl.h... yes
checking for TAILQ_FOREACH in sys/queue.h... yes
checking for timeradd in sys/time.h... yes
checking for timercmp in sys/time.h... yes
checking for timerclear in sys/time.h... yes
checking for timerisset in sys/time.h... yes
checking whether CTL_KERN is declared... yes
checking whether KERN_RANDOM is declared... yes
checking whether RANDOM_UUID is declared... yes
checking whether KERN_ARND is declared... no
checking for an ANSI C-conforming const... yes
checking for inline... inline
checking whether time.h and sys/time.h may both be included... yes
checking for gettimeofday... yes
checking for vasprintf... yes
checking for fcntl... yes
checking for clock_gettime... yes
checking for strtok_r... yes
checking for strsep... yes
checking for getnameinfo... yes
checking for strlcpy... no
checking for inet_ntop... yes
checking for inet_pton... 
Selecting previously unselected package gfortran.
Preparing to unpack .../40-gfortran_4%3a9.3.0-1ubuntu2_amd64.deb ...
Unpacking gfortran (4:9.3.0-1ubuntu2) ...
Selecting previously unselected package libgfortran-8-dev:amd64.
Preparing to unpack .../41-libgfortran-8-dev_8.4.0-3ubuntu2_amd64.deb ...
Unpacking libgfortran-8-dev:amd64 (8.4.0-3ubuntu2) ...
Selecting previously unselected package gfortran-8.
Preparing to unpack .../42-gfortran-8_8.4.0-3ubuntu2_amd64.deb ...
Unpacking gfortran-8 (8.4.0-3ubuntu2) ...
yes
checking for signal... yes
checking for sigaction... yes
checking for strtoll... yes
checking for inet_aton... yes
checking for pipe... yes
checking for eventfd... yes
checking for sendfile... yes
checking for mmap... yes
checking for splice... yes
checking for arc4random... no
checking for arc4random_buf... no
checking for issetugid... no
checking for geteuid... yes
checking for getegid... yes
checking for getprotobynumber... yes
checking for setenv... yes
checking for unsetenv... 
Selecting previously unselected package libbrotli1:amd64.
Preparing to unpack .../43-libbrotli1_1.0.7-6ubuntu0.1_amd64.deb ...
Unpacking libbrotli1:amd64 (1.0.7-6ubuntu0.1) ...
yes
checking for putenv... yes
checking for sysctl... yes
checking for umask... yes
checking for getaddrinfo... 
Selecting previously unselected package libnghttp2-14:amd64.
Preparing to unpack .../44-libnghttp2-14_1.40.0-1ubuntu0.2_amd64.deb ...
Unpacking libnghttp2-14:amd64 (1.40.0-1ubuntu0.2) ...
yes
checking size of long... 8
checking for F_SETFD in fcntl.h... yes
checking for select... yes
checking for select support... yes
checking for poll... 
Selecting previously unselected package librtmp1:amd64.
Preparing to unpack .../45-librtmp1_2.4+20151223.gitfa8646d.1-2build1_amd64.deb ...
Unpacking librtmp1:amd64 (2.4+20151223.gitfa8646d.1-2build1) ...
Selecting previously unselected package libssh-4:amd64.
Preparing to unpack .../46-libssh-4_0.9.3-2ubuntu2.5_amd64.deb ...
yes
checking for poll support... yes
checking for /dev/poll support... no
checking for epoll support... enabled
checking for epoll_ctl... yes
checking if epoll can build... yes
checking for working epoll library interface... yes
checking for epoll syscall support... no
checking for evport support... no
checking event_ops... yes
checking for working ops... yes
checking for pid_t... yes
checking for size_t... 
Unpacking libssh-4:amd64 (0.9.3-2ubuntu2.5) ...
Selecting previously unselected package libcurl3-gnutls:amd64.
Preparing to unpack .../47-libcurl3-gnutls_7.68.0-1ubuntu2.21_amd64.deb ...
Unpacking libcurl3-gnutls:amd64 (7.68.0-1ubuntu2.21) ...
yes
checking for ssize_t... yes
checking for uint64_t... yes
checking for uint32_t... 
Selecting previously unselected package liberror-perl.
Preparing to unpack .../48-liberror-perl_0.17029-1_all.deb ...
Unpacking liberror-perl (0.17029-1) ...
yes
checking for uint16_t... yes
checking for uint8_t... yes
checking for uintptr_t... yes
checking for fd_mask... 
Selecting previously unselected package git-man.
Preparing to unpack .../49-git-man_1%3a2.25.1-1ubuntu3.11_all.deb ...
Unpacking git-man (1:2.25.1-1ubuntu3.11) ...
yes
checking size of long long... 8
checking size of long... (cached) 8
checking size of int... 4
checking size of short... 
Selecting previously unselected package git.
Preparing to unpack .../50-git_1%3a2.25.1-1ubuntu3.11_amd64.deb ...
Unpacking git (1:2.25.1-1ubuntu3.11) ...
2
checking size of size_t... 8
checking size of void *... 8
checking for struct in6_addr... yes
checking for struct sockaddr_in6... yes
checking for sa_family_t... yes
checking for struct addrinfo... yes
checking for struct sockaddr_storage... yes
checking for struct in6_addr.s6_addr32... yes
checking for struct in6_addr.s6_addr16... yes
checking for struct sockaddr_in.sin_len... no
checking for struct sockaddr_in6.sin6_len... no
checking for struct sockaddr_storage.ss_family... yes
checking for struct sockaddr_storage.__ss_family... no
checking for socklen_t... yes
checking whether our compiler supports __func__... yes
checking for the pthreads library -lpthreads... no
checking whether pthreads work without any flags... no
checking whether pthreads work with -Kthread... no
checking whether pthreads work with -kthread... no
checking for the pthreads library -llthread... no
checking whether pthreads work with -pthread... yes
checking for joinable pthread attribute... PTHREAD_CREATE_JOINABLE
checking if more special flags are required for pthreads... no
checking size of pthread_t... 8
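 
The pthread lines above mean the only flag this builder needs for POSIX threads is -pthread, and that joinable threads work. A minimal sketch of that kind of probe, compiled as `gcc -pthread probe.c`, follows; the file name and body are assumptions, not the generated conftest.

/* Sketch of a joinable-pthread probe (illustrative only). */
#include <pthread.h>

static void *worker(void *arg) { return arg; }

int main(void)
{
    pthread_t t;
    pthread_attr_t attr;
    pthread_attr_init(&attr);
    /* PTHREAD_CREATE_JOINABLE is the attribute reported in the log above. */
    pthread_attr_setdetachstate(&attr, PTHREAD_CREATE_JOINABLE);
    if (pthread_create(&t, &attr, worker, (void *)0) != 0)
        return 1;   /* would fail to link without -pthread / libpthread */
    pthread_join(t, (void **)0);
    pthread_attr_destroy(&attr);
    return 0;
}
 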
checking that generated files are newer than configure... done
configure: creating ./config.status
Selecting previously unselected package googletest.
Preparing to unpack .../51-googletest_1.10.0-2_all.deb ...
Unpacking googletest (1.10.0-2) ...
config.status: creating libevent.pc
config.status: creating libevent_openssl.pc
config.status: creating libevent_pthreads.pc
config.status: creating Makefile
config.status: creating include/Makefile
config.status: creating config.h
config.status: executing depfiles commands
config.status: executing libtool commands
configure: /bin/bash '../../../../../../openmpi/opal/mca/event/libevent2022/libevent/configure' succeeded for opal/mca/event/libevent2022/libevent
checking if MCA component event:libevent2022 can compile... yes
 
checking if have working event ops for the event framework... yes
checking for winning event component header file... libevent2022/libevent2022.h

+++ Configuring MCA framework if
checking for no configure components in framework if... 
checking for m4 configure components in framework if... bsdx_ipv4, bsdx_ipv6, linux_ipv6, posix_ipv4, solaris_ipv6

--- MCA component if:bsdx_ipv4 (m4 configuration macro)
checking for MCA component if:bsdx_ipv4 compile mode... static
checking struct sockaddr... yes (cached)
checking NetBSD, FreeBSD, OpenBSD, or DragonFly... no
checking if MCA component if:bsdx_ipv4 can compile... no

--- MCA component if:bsdx_ipv6 (m4 configuration macro)
checking for MCA component if:bsdx_ipv6 compile mode... static
checking struct sockaddr... yes (cached)
checking some flavor of BSD... no
checking if MCA component if:bsdx_ipv6 can compile... no

--- MCA component if:linux_ipv6 (m4 configuration macro)
checking for MCA component if:linux_ipv6 compile mode... static
checking if we are on Linux with TCP... yes
checking if MCA component if:linux_ipv6 can compile... yes

--- MCA component if:posix_ipv4 (m4 configuration macro)
checking for MCA component if:posix_ipv4 compile mode... static
checking struct sockaddr... yes (cached)
checking not NetBSD, FreeBSD, OpenBSD, or DragonFly... yes
checking for struct ifreq.ifr_hwaddr... yes
Selecting previously unselected package libnl-3-200:amd64.
checking for struct ifreq.ifr_mtu... yes
checking if MCA component if:posix_ipv4 can compile... yes

--- MCA component if:solaris_ipv6 (m4 configuration macro)
checking for MCA component if:solaris_ipv6 compile mode... static
checking if MCA component if:solaris_ipv6 can compile... no

+++ Configuring MCA framework installdirs
checking for no configure components in framework installdirs... 
checking for m4 configure components in framework installdirs... config, env

--- MCA component installdirs:env (m4 configuration macro, priority 10)
checking for MCA component installdirs:env compile mode... static
checking if MCA component installdirs:env can compile... yes

--- MCA component installdirs:config (m4 configuration macro, priority 0)
checking for MCA component installdirs:config compile mode... static
checking if MCA component installdirs:config can compile... yes

+++ Pre-configuring the framework memchecker
checking if --enable-memchecker was specified... no (adding "memchecker" to --enable-mca-no-build list)

+++ Configuring MCA framework memchecker
checking for no configure components in framework memchecker... 
checking for m4 configure components in framework memchecker... valgrind

--- MCA component memchecker:valgrind (m4 configuration macro, priority 10)
checking for MCA component memchecker:valgrind compile mode... static
checking --with-valgrind value... simple ok (unspecified value)
checking valgrind/valgrind.h usability... no
Preparing to unpack .../52-libnl-3-200_3.4.0-1ubuntu0.1_amd64.deb ...
Unpacking libnl-3-200:amd64 (3.4.0-1ubuntu0.1) ...
checking valgrind/valgrind.h presence... no
checking for valgrind/valgrind.h... no
configure: WARNING: valgrind.h not found
configure: WARNING: Cannot compile this component
checking if MCA component memchecker:valgrind can compile... no

+++ Configuring MCA framework memcpy
checking for no configure components in framework memcpy... 
checking for m4 configure components in framework memcpy... 

+++ Configuring MCA framework memory
checking for no configure components in framework memory... 
checking for m4 configure components in framework memory... malloc_solaris, patcher

--- MCA component memory:patcher (m4 configuration macro, priority 41)
checking for MCA component memory:patcher compile mode... static
checking for __curbrk... yes
Selecting previously unselected package libnl-route-3-200:amd64.
Preparing to unpack .../53-libnl-route-3-200_3.4.0-1ubuntu0.1_amd64.deb ...
Unpacking libnl-route-3-200:amd64 (3.4.0-1ubuntu0.1) ...
checking linux/mman.h usability... yes
checking linux/mman.h presence... yes
checking for linux/mman.h... yes
checking sys/syscall.h usability... yes
checking sys/syscall.h presence... yes
checking for sys/syscall.h... yes
checking whether __syscall is declared... no
Selecting previously unselected package libibverbs1:amd64.
Preparing to unpack .../54-libibverbs1_28.0-1ubuntu1_amd64.deb ...
Unpacking libibverbs1:amd64 (28.0-1ubuntu1) ...
checking for __syscall... no
checking if MCA component memory:patcher can compile... yes

--- MCA component memory:malloc_solaris (m4 configuration macro, priority 0)
checking for MCA component memory:malloc_solaris compile mode... static
checking for Solaris... no
checking if MCA component memory:malloc_solaris can compile... no

+++ Configuring MCA framework mpool
checking for no configure components in framework mpool... hugepage
checking for m4 configure components in framework mpool... memkind

--- MCA component mpool:hugepage (no configuration)
checking for MCA component mpool:hugepage compile mode... dso
checking if MCA component mpool:hugepage can compile... yes

--- MCA component mpool:memkind (m4 configuration macro)
checking for MCA component mpool:memkind compile mode... dso
checking --with-memkind value... simple ok (unspecified value)
checking memkind.h usability... no
checking memkind.h presence... no
Selecting previously unselected package ibverbs-providers:amd64.
Preparing to unpack .../55-ibverbs-providers_28.0-1ubuntu1_amd64.deb ...
Unpacking ibverbs-providers:amd64 (28.0-1ubuntu1) ...
checking for memkind.h... no
checking if MCA component mpool:memkind can compile... no

+++ Configuring MCA framework patcher
checking for no configure components in framework patcher... 
checking for m4 configure components in framework patcher... linux, overwrite

--- MCA component patcher:linux (m4 configuration macro)
checking for MCA component patcher:linux compile mode... dso
checking dlfcn.h usability... yes
checking dlfcn.h presence... yes
checking for dlfcn.h... yes
looking for library without search path
checking for library containing dl_iterate_phdr... none required
Selecting previously unselected package libonig5:amd64.
Preparing to unpack .../56-libonig5_6.9.4-1_amd64.deb ...
Unpacking libonig5:amd64 (6.9.4-1) ...
Selecting previously unselected package libjq1:amd64.
Preparing to unpack .../57-libjq1_1.6-1ubuntu0.20.04.1_amd64.deb ...
Unpacking libjq1:amd64 (1.6-1ubuntu0.20.04.1) ...
checking if libdl requires libnl v1 or v3... 
checking elf.h usability... yes
checking elf.h presence... yes
checking for elf.h... yes
checking sys/auxv.h usability... yes
Selecting previously unselected package jq.
Preparing to unpack .../58-jq_1.6-1ubuntu0.20.04.1_amd64.deb ...
Unpacking jq (1.6-1ubuntu0.20.04.1) ...
checking sys/auxv.h presence... yes
checking for sys/auxv.h... yes
checking if MCA component patcher:linux can compile... no

--- MCA component patcher:overwrite (m4 configuration macro)
checking for MCA component patcher:overwrite compile mode... dso
checking if MCA component patcher:overwrite can compile... yes

+++ Configuring MCA framework pmix
checking for no configure components in framework pmix... isolated
checking for m4 configure components in framework pmix... cray, ext1x, ext2x, ext3x, flux, pmix3x, s1, s2

--- MCA component pmix:isolated (no configuration)
checking for MCA component pmix:isolated compile mode... dso
checking if MCA component pmix:isolated can compile... yes

--- MCA component pmix:cray (m4 configuration macro)
checking for MCA component pmix:cray compile mode... dso
checking for Cray PMI support... checking for CRAY_PMI... no
checking for ALPS support cle level unknown... checking for CRAY_ALPSLLI... no
checking for CRAY_ALPSUTIL... no
checking for CRAY_ALPS... no
checking for CRAY_WLM_DETECT... no
opal_check_cray_alps_happy = no
checking if MCA component pmix:cray can compile... no

--- MCA component pmix:ext1x (m4 configuration macro)
checking for MCA component pmix:ext1x compile mode... dso
checking if MCA component pmix:ext1x can compile... no

--- MCA component pmix:ext2x (m4 configuration macro)
checking for MCA component pmix:ext2x compile mode... dso
checking if MCA component pmix:ext2x can compile... no

--- MCA component pmix:ext3x (m4 configuration macro)
checking for MCA component pmix:ext3x compile mode... dso
checking if MCA component pmix:ext3x can compile... no

--- MCA component pmix:flux (m4 configuration macro)
checking for MCA component pmix:flux compile mode... dso
checking if user wants Flux support to link against PMI library... no
checking if Flux support allowed to use dlopen... yes
checking Checking if Flux PMI support can be built... yes
checking if MCA component pmix:flux can compile... yes

--- MCA component pmix:pmix3x (m4 configuration macro)
checking for MCA component pmix:pmix3x compile mode... dso
checking if PMIx timing is enabled... no (disabled)
checking if want to install standalone libpmix... no
configure: OPAL configuring in opal/mca/pmix/pmix3x/pmix
Selecting previously unselected package libevent-2.1-7:amd64.
Preparing to unpack .../59-libevent-2.1-7_2.1.11-stable-1_amd64.deb ...
Unpacking libevent-2.1-7:amd64 (2.1.11-stable-1) ...
Selecting previously unselected package libevent-core-2.1-7:amd64.
Preparing to unpack .../60-libevent-core-2.1-7_2.1.11-stable-1_amd64.deb ...
Unpacking libevent-core-2.1-7:amd64 (2.1.11-stable-1) ...
configure: running /bin/bash '../../../../../../openmpi/opal/mca/pmix/pmix3x/pmix/configure' --with-pmix-symbol-rename=OPAL_MCA_PMIX3X_ --enable-embedded-mode --disable-debug --disable-pmix-timing --without-tests-examples --disable-pmix-binaries --disable-pmix-backward-compatibility --disable-visibility --enable-embedded-libevent --with-libevent-header=\"opal/mca/event/libevent2022/libevent2022.h\" --enable-embedded-hwloc --with-hwloc-header=\"opal/mca/hwloc/hwloc201/hwloc201.h\"  '--prefix=/opt/openmpi' '--with-cuda' 'CFLAGS=-w' 'CFLAGS=-O3 -DNDEBUG -w ' 'CPPFLAGS=-I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/include -I/scratch/build/opal/mca/event/libevent2022/libevent/include -I/scratch/openmpi/opal/mca/event/libevent2022/libevent -I/scratch/openmpi/opal/mca/event/libevent2022/libevent/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include  ' --cache-file=/dev/null --srcdir=../../../../../../openmpi/opal/mca/pmix/pmix3x/pmix --disable-option-checking
Selecting previously unselected package libevent-pthreads-2.1-7:amd64.
Preparing to unpack .../61-libevent-pthreads-2.1-7_2.1.11-stable-1_amd64.deb ...
Unpacking libevent-pthreads-2.1-7:amd64 (2.1.11-stable-1) ...
configure: builddir: /scratch/build/opal/mca/pmix/pmix3x/pmix
configure: srcdir: /scratch/openmpi/opal/mca/pmix/pmix3x/pmix
configure: Detected VPATH build

============================================================================
== Configuring PMIx
============================================================================
checking build system type... x86_64-unknown-linux-gnu
checking host system type... x86_64-unknown-linux-gnu
checking target system type... x86_64-unknown-linux-gnu
checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
checking for a thread-safe mkdir -p... /usr/bin/mkdir -p
checking for gawk... no
checking for mawk... mawk
checking whether make sets $(MAKE)... yes
Selecting previously unselected package libpsm-infinipath1.
Preparing to unpack .../62-libpsm-infinipath1_3.3+20.604758e7-6_amd64.deb ...
Unpacking libpsm-infinipath1 (3.3+20.604758e7-6) ...
checking whether make supports nested variables... yes
checking whether make supports nested variables... (cached) yes
checking for style of include used by make... GNU
checking for gcc... gcc
checking whether the C compiler works... yes
checking for C compiler default output file name... a.out
checking for suffix of executables... 
checking whether we are cross compiling... no
Selecting previously unselected package libpsm2-2.
Preparing to unpack .../63-libpsm2-2_11.2.86-1_amd64.deb ...
Unpacking libpsm2-2 (11.2.86-1) ...
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether gcc accepts -g... yes
checking for gcc option to accept ISO C89... none needed
checking whether gcc understands -c and -o together... yes
Selecting previously unselected package librdmacm1:amd64.
Preparing to unpack .../64-librdmacm1_28.0-1ubuntu1_amd64.deb ...
Unpacking librdmacm1:amd64 (28.0-1ubuntu1) ...
Selecting previously unselected package libfabric1.
Preparing to unpack .../65-libfabric1_1.6.2-3ubuntu0.1_amd64.deb ...
Unpacking libfabric1 (1.6.2-3ubuntu0.1) ...
checking dependency style of gcc... gcc3
checking how to run the C preprocessor... gcc -E
checking for grep that handles long lines and -e... /usr/bin/grep
checking for egrep... /usr/bin/grep -E
checking for ANSI C header files... yes
Selecting previously unselected package libltdl7:amd64.
Preparing to unpack .../66-libltdl7_2.4.6-14_amd64.deb ...
Unpacking libltdl7:amd64 (2.4.6-14) ...
checking for sys/types.h... yes
checking for sys/stat.h... yes
checking for stdlib.h... yes
Selecting previously unselected package libhwloc15:amd64.
Preparing to unpack .../67-libhwloc15_2.1.0+dfsg-4_amd64.deb ...
Unpacking libhwloc15:amd64 (2.1.0+dfsg-4) ...
checking for string.h... yes
checking for memory.h... yes
checking for strings.h... yes
checking for inttypes.h... yes
checking for stdint.h... yes
Selecting previously unselected package libxnvctrl0:amd64.
Preparing to unpack .../68-libxnvctrl0_550.54.15-0ubuntu1_amd64.deb ...
Unpacking libxnvctrl0:amd64 (550.54.15-0ubuntu1) ...
Selecting previously unselected package ocl-icd-libopencl1:amd64.
Preparing to unpack .../69-ocl-icd-libopencl1_2.2.11-1ubuntu1_amd64.deb ...
Unpacking ocl-icd-libopencl1:amd64 (2.2.11-1ubuntu1) ...
checking for unistd.h... yes
checking minix/config.h usability... no
checking minix/config.h presence... no
checking for minix/config.h... no
checking whether it is safe to define __EXTENSIONS__... yes
Selecting previously unselected package libhwloc-plugins:amd64.
Preparing to unpack .../70-libhwloc-plugins_2.1.0+dfsg-4_amd64.deb ...
Unpacking libhwloc-plugins:amd64 (2.1.0+dfsg-4) ...
checking for ar... ar
checking the archiver (ar) interface... ar
checking for flex... no
checking for lex... no
checking if want dlopen support... yes
checking if embedded mode is enabled... yes
checking if want developer-level compiler pickyness... no
checking if want developer-level debugging code... no
checking if want to install project-internal header files... no
checking if tests and examples are to be installed... no
checking if want pretty-print stacktrace... yes
checking if want dstore pthread-based locking... yes
checking if want ident string... 
checking if want developer-level timing support... no
checking if want backward compatibility for PMI-1 and PMI-2... yes
checking if want to disable binaries... no
checking if want to support dlopen of non-global namespaces... yes
checking if want build psec/dummy_handshake... no
installing to directory "/opt/openmpi"
checking how to print strings... printf
checking for a sed that does not truncate output... /usr/bin/sed
checking for fgrep... /usr/bin/grep -F
checking for ld used by gcc... /usr/bin/ld
checking if the linker (/usr/bin/ld) is GNU ld... yes
checking for BSD- or MS-compatible name lister (nm)... /usr/bin/nm -B
checking the name lister (/usr/bin/nm -B) interface... BSD nm
checking whether ln -s works... yes
checking the maximum length of command line arguments... 1572864
checking how to convert x86_64-unknown-linux-gnu file names to x86_64-unknown-linux-gnu format... func_convert_file_noop
checking how to convert x86_64-unknown-linux-gnu file names to toolchain format... func_convert_file_noop
checking for /usr/bin/ld option to reload object files... -r
checking for objdump... objdump
checking how to recognize dependent libraries... pass_all
checking for dlltool... no
checking how to associate runtime and link libraries... printf %s\n
checking for archiver @FILE support... @
Selecting previously unselected package libpmix2:amd64.
Preparing to unpack .../71-libpmix2_3.1.5-1_amd64.deb ...
Unpacking libpmix2:amd64 (3.1.5-1) ...
checking for strip... strip
checking for ranlib... ranlib
checking command to parse /usr/bin/nm -B output from gcc object... ok
checking for sysroot... no
checking for a working dd... /usr/bin/dd
checking how to truncate binary pipes... /usr/bin/dd bs=4096 count=1
../../../../../../openmpi/opal/mca/pmix/pmix3x/pmix/configure: line 9027: /usr/bin/file: No such file or directory
checking for mt... no
checking if : is a manifest tool... no
checking for dlfcn.h... yes
Selecting previously unselected package libopenmpi3:amd64.
Preparing to unpack .../72-libopenmpi3_4.0.3-0ubuntu1_amd64.deb ...
Unpacking libopenmpi3:amd64 (4.0.3-0ubuntu1) ...
checking for objdir... .libs
checking if gcc supports -fno-rtti -fno-exceptions... yes
checking for gcc option to produce PIC... -fPIC -DPIC
checking if gcc PIC flag -fPIC -DPIC works... yes
checking if gcc static flag -static works... yes
checking if gcc supports -c -o file.o... yes
checking if gcc supports -c -o file.o... (cached) yes
checking whether the gcc linker (/usr/bin/ld) supports shared libraries... yes
checking whether -lc should be explicitly linked in... no
checking dynamic linker characteristics... GNU/Linux ld.so
checking how to hardcode library paths into programs... immediate
checking whether stripping libraries is possible... yes
checking if libtool supports shared libraries... yes
checking whether to build shared libraries... yes
checking whether to build static libraries... no

*** C compiler and preprocessor
checking for gcc... (cached) gcc
checking whether we are using the GNU C compiler... (cached) yes
checking whether gcc accepts -g... (cached) yes
checking for gcc option to accept ISO C89... (cached) none needed
checking whether gcc understands -c and -o together... (cached) yes
checking dependency style of gcc... (cached) gcc3
checking for BSD- or MS-compatible name lister (nm)... (cached) /usr/bin/nm -B
checking the name lister (/usr/bin/nm -B) interface... (cached) BSD nm
checking __NetBSD__... no
checking __FreeBSD__... no
checking __OpenBSD__... no
Selecting previously unselected package libcaf-openmpi-3:amd64.
Preparing to unpack .../73-libcaf-openmpi-3_2.8.0-1_amd64.deb ...
Unpacking libcaf-openmpi-3:amd64 (2.8.0-1) ...
Selecting previously unselected package libtdb1:amd64.
Preparing to unpack .../74-libtdb1_1.4.5-0ubuntu0.20.04.1_amd64.deb ...
Unpacking libtdb1:amd64 (1.4.5-0ubuntu0.20.04.1) ...
checking __DragonFly__... no
checking __386BSD__... no
checking __bsdi__... no
checking __APPLE__... no
checking __linux__... yes
checking __sun__... no
checking __sun... no
checking netdb.h usability... yes
checking netdb.h presence... yes
checking for netdb.h... yes
checking netinet/in.h usability... yes
checking netinet/in.h presence... yes
checking for netinet/in.h... yes
checking netinet/tcp.h usability... yes
checking netinet/tcp.h presence... yes
checking for netinet/tcp.h... yes
Selecting previously unselected package libvorbisfile3:amd64.
Preparing to unpack .../75-libvorbisfile3_1.3.6-2ubuntu1_amd64.deb ...
Unpacking libvorbisfile3:amd64 (1.3.6-2ubuntu1) ...
Selecting previously unselected package sound-theme-freedesktop.
Preparing to unpack .../76-sound-theme-freedesktop_0.8-2ubuntu1_all.deb ...
Unpacking sound-theme-freedesktop (0.8-2ubuntu1) ...
checking for struct sockaddr_in... yes
configure: pmix builddir: /scratch/build/opal/mca/pmix/pmix3x/pmix
configure: pmix srcdir: /scratch/openmpi/opal/mca/pmix/pmix3x/pmix
configure: Detected VPATH build
checking for pmix version... 3.1.4
checking if want pmix maintainer support... disabled
checking for pmix directory prefix... (none)
checking for symbol rename... OPAL_MCA_PMIX3X_
checking for extra lib... no
checking for extra ltlib... no
checking if want package/brand string... PMIx root@82e0ab48e4bf Distribution

============================================================================
== Compiler and preprocessor tests
============================================================================
checking if gcc requires a flag for C11... no
configure: verifying gcc supports C11 without a flag
checking if gcc  supports C11 _Thread_local... yes
Selecting previously unselected package libcanberra0:amd64.
Preparing to unpack .../77-libcanberra0_0.30-7ubuntu1_amd64.deb ...
Unpacking libcanberra0:amd64 (0.30-7ubuntu1) ...
checking if gcc  supports C11 atomic variables... yes
checking if gcc  supports C11 _Atomic keyword... yes
checking if gcc  supports C11 _Generic keyword... yes
Selecting previously unselected package libcoarrays-dev:amd64.
Preparing to unpack .../78-libcoarrays-dev_2.8.0-1_amd64.deb ...
Unpacking libcoarrays-dev:amd64 (2.8.0-1) ...
checking if gcc  supports C11 _Static_assert... yes
checking if gcc  supports C11 atomic_fetch_xor_explicit... yes
configure: no flag required for C11 support
checking if gcc  supports __thread... yes
checking if gcc  supports C11 _Thread_local... yes
Selecting previously unselected package openmpi-common.
Preparing to unpack .../79-openmpi-common_4.0.3-0ubuntu1_all.deb ...
Unpacking openmpi-common (4.0.3-0ubuntu1) ...
checking for the C compiler vendor... gnu
checking for ANSI C header files... (cached) yes
checking if gcc supports -finline-functions... yes
checking if gcc supports -fno-strict-aliasing... yes
Selecting previously unselected package openmpi-bin.
Preparing to unpack .../80-openmpi-bin_4.0.3-0ubuntu1_amd64.deb ...
Unpacking openmpi-bin (4.0.3-0ubuntu1) ...
configure: WARNING:  -fno-strict-aliasing has been added to CFLAGS
checking if gcc supports __builtin_expect... yes
checking if gcc supports __builtin_prefetch... yes
checking if gcc supports __builtin_clz... yes
Selecting previously unselected package libcoarrays-openmpi-dev:amd64.
Preparing to unpack .../81-libcoarrays-openmpi-dev_2.8.0-1_amd64.deb ...
Unpacking libcoarrays-openmpi-dev:amd64 (2.8.0-1) ...
checking for C optimization flags... -DNDEBUG -O3 -w -finline-functions -fno-strict-aliasing
checking for int8_t... yes
checking for uint8_t... yes
Selecting previously unselected package libevent-extra-2.1-7:amd64.
Preparing to unpack .../82-libevent-extra-2.1-7_2.1.11-stable-1_amd64.deb ...
Unpacking libevent-extra-2.1-7:amd64 (2.1.11-stable-1) ...
checking for int16_t... yes
checking for uint16_t... yes
Selecting previously unselected package libevent-openssl-2.1-7:amd64.
Preparing to unpack .../83-libevent-openssl-2.1-7_2.1.11-stable-1_amd64.deb ...
Unpacking libevent-openssl-2.1-7:amd64 (2.1.11-stable-1) ...
checking for int32_t... yes
checking for uint32_t... yes
checking for int64_t... yes
checking for uint64_t... yes
checking for __int128... yes
checking for uint128_t... no
checking for long long... yes
checking for intptr_t... yes
checking for uintptr_t... yes
checking for ptrdiff_t... yes
checking size of _Bool... 1
checking size of char... 1
Selecting previously unselected package libevent-dev.
Preparing to unpack .../84-libevent-dev_2.1.11-stable-1_amd64.deb ...
Unpacking libevent-dev (2.1.11-stable-1) ...
Selecting previously unselected package libgpm2:amd64.
Preparing to unpack .../85-libgpm2_1.20.7-5_amd64.deb ...
Unpacking libgpm2:amd64 (1.20.7-5) ...
checking size of short... 2
checking size of int... 4
Selecting previously unselected package libgtest-dev:amd64.
Preparing to unpack .../86-libgtest-dev_1.10.0-2_amd64.deb ...
Unpacking libgtest-dev:amd64 (1.10.0-2) ...
checking size of long... 8
checking size of long long... 8
checking size of float... 4
checking size of double... 8
checking size of void *... 8
checking size of size_t... 8
checking size of ptrdiff_t... 8
checking size of wchar_t... 4
checking size of pid_t... 4
checking alignment of bool... 1
Selecting previously unselected package libltdl-dev:amd64.
Preparing to unpack .../87-libltdl-dev_2.4.6-14_amd64.deb ...
Unpacking libltdl-dev:amd64 (2.4.6-14) ...
checking alignment of int8_t... 1
checking alignment of int16_t... 2
checking alignment of int32_t... 4
Selecting previously unselected package libnl-3-dev:amd64.
Preparing to unpack .../88-libnl-3-dev_3.4.0-1ubuntu0.1_amd64.deb ...
Unpacking libnl-3-dev:amd64 (3.4.0-1ubuntu0.1) ...
checking alignment of int64_t... 8
checking alignment of char... 1
checking alignment of short... 2
checking alignment of wchar_t... 4
checking alignment of int... 4
Selecting previously unselected package libnl-route-3-dev:amd64.
Preparing to unpack .../89-libnl-route-3-dev_3.4.0-1ubuntu0.1_amd64.deb ...
Unpacking libnl-route-3-dev:amd64 (3.4.0-1ubuntu0.1) ...
Selecting previously unselected package libpython3.8:amd64.
Preparing to unpack .../90-libpython3.8_3.8.10-0ubuntu1~20.04.9_amd64.deb ...
Unpacking libpython3.8:amd64 (3.8.10-0ubuntu1~20.04.9) ...
checking alignment of long... 8
checking alignment of long long... 8
checking alignment of float... 4
checking alignment of double... 8
checking alignment of void *... 8
checking alignment of size_t... 8
checking for C bool type... no
checking size of _Bool... (cached) 1
checking for inline... __inline__
Selecting previously unselected package libtool.
Preparing to unpack .../91-libtool_2.4.6-14_all.deb ...
Unpacking libtool (2.4.6-14) ...
Selecting previously unselected package ninja-build.
Preparing to unpack .../92-ninja-build_1.10.0-1build1_amd64.deb ...
Unpacking ninja-build (1.10.0-1build1) ...

*** Compiler characteristics
checking for __attribute__... yes
checking for __attribute__(aligned)... yes
checking for __attribute__(always_inline)... yes
Selecting previously unselected package vim-runtime.
Preparing to unpack .../93-vim-runtime_2%3a8.1.2269-1ubuntu5.22_all.deb ...
Adding 'diversion of /usr/share/vim/vim81/doc/help.txt to /usr/share/vim/vim81/doc/help.txt.vim-tiny by vim-runtime'
Adding 'diversion of /usr/share/vim/vim81/doc/tags to /usr/share/vim/vim81/doc/tags.vim-tiny by vim-runtime'
Unpacking vim-runtime (2:8.1.2269-1ubuntu5.22) ...
checking for __attribute__(cold)... yes
checking for __attribute__(const)... yes
checking for __attribute__(deprecated)... yes
checking for __attribute__(deprecated_argument)... yes
checking for __attribute__(format)... yes
checking for __attribute__(format_funcptr)... yes
checking for __attribute__(hot)... yes
checking for __attribute__(malloc)... yes
checking for __attribute__(may_alias)... yes
checking for __attribute__(no_instrument_function)... yes
checking for __attribute__(nonnull)... yes
checking for __attribute__(noreturn)... yes
checking for __attribute__(noreturn_funcptr)... yes
checking for __attribute__(packed)... yes
checking for __attribute__(pure)... yes
checking for __attribute__(sentinel)... yes
checking for __attribute__(unused)... yes
checking for __attribute__(visibility)... yes
checking for __attribute__(warn_unused_result)... yes
checking for __attribute__(destructor)... yes
checking for __attribute__(optnone)... yes
checking for __attribute__(extension)... yes
checking for compiler familyid... 0
checking for compiler familyname... UNKNOWN
checking for compiler version... 0
checking for compiler version_str... UNKNOWN

*** Assembler
checking dependency style of gcc... gcc3
checking for perl... /usr/bin/perl
checking for atomic_compare_exchange_strong_16... no
checking for atomic_compare_exchange_strong_16 with -mcx16... no
checking for atomic_compare_exchange_strong_16 with -latomic... no
checking for __sync_bool_compare_and_swap... no
checking for __sync_bool_compare_and_swap with -mcx16... yes
checking if __sync_bool_compare_and_swap() gives correct results... yes
checking for atomic_compare_exchange_strong_16... no
checking for atomic_compare_exchange_strong_16 with -mcx16... no
checking for atomic_compare_exchange_strong_16 with -latomic... no
checking for __sync_bool_compare_and_swap... yes
checking if __sync_bool_compare_and_swap() gives correct results... yes
checking if .proc/endp is needed... no
checking directive for setting text section... .text
checking directive for exporting symbols... .globl
checking for objdump... objdump
checking if .note.GNU-stack is needed... yes
checking suffix for labels... :
checking prefix for global symbol labels... 
checking prefix for lsym labels... .L
checking prefix for function in .type... @
checking if .size is needed... yes
checking if .align directive takes logarithmic value... no
checking for cmpxchg16b... yes
checking if cmpxchg16b() gives correct results... yes
checking if cmpxchg16b_result works... yes
checking if gcc supports GCC inline assembly... yes
checking for assembly format... default-.text-.globl-:--.L-@-1-0-1-1-1
checking for assembly architecture... X86_64
checking for builtin atomics... BUILTIN_C11
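 
"BUILTIN_C11" here means the build will rely on the compiler's C11 atomics rather than the hand-written assembly paths probed just above. A short, self-contained sketch of the operations in question (atomic fetch-add and compare-exchange), assuming only <stdatomic.h>, follows; it is illustrative, not PMIx's own test code.

/* Sketch of C11 builtin atomics usage (illustrative only). */
#include <stdatomic.h>
#include <stdio.h>

int main(void)
{
    atomic_int counter = 0;
    /* Relaxed fetch-add, one of the builtin operations probed above. */
    atomic_fetch_add_explicit(&counter, 5, memory_order_relaxed);

    int expected = 5;
    /* Strong compare-exchange; succeeds and stores 7 because counter == 5. */
    _Bool swapped = atomic_compare_exchange_strong(&counter, &expected, 7);

    printf("swapped=%d value=%d\n", (int)swapped, (int)atomic_load(&counter));
    return 0;
}
 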

============================================================================
== Header file tests
============================================================================
checking arpa/inet.h usability... yes
checking arpa/inet.h presence... yes
checking for arpa/inet.h... yes
checking fcntl.h usability... yes
checking fcntl.h presence... yes
checking for fcntl.h... yes
checking ifaddrs.h usability... yes
checking ifaddrs.h presence... yes
checking for ifaddrs.h... yes
checking for inttypes.h... (cached) yes
checking libgen.h usability... yes
checking libgen.h presence... yes
checking for libgen.h... yes
checking net/uio.h usability... no
checking net/uio.h presence... no
checking for net/uio.h... no
checking for netinet/in.h... (cached) yes
checking for stdint.h... (cached) yes
checking stddef.h usability... yes
checking stddef.h presence... yes
checking for stddef.h... yes
checking for stdlib.h... (cached) yes
checking for string.h... (cached) yes
checking for strings.h... (cached) yes
checking sys/ioctl.h usability... yes
checking sys/ioctl.h presence... yes
checking for sys/ioctl.h... yes
checking sys/param.h usability... yes
checking sys/param.h presence... yes
checking for sys/param.h... yes
checking sys/select.h usability... yes
checking sys/select.h presence... yes
checking for sys/select.h... yes
checking sys/socket.h usability... yes
checking sys/socket.h presence... yes
checking for sys/socket.h... yes
checking sys/sockio.h usability... no
checking sys/sockio.h presence... no
checking for sys/sockio.h... no
checking stdarg.h usability... yes
checking stdarg.h presence... yes
checking for stdarg.h... yes
checking for sys/stat.h... (cached) yes
checking sys/time.h usability... yes
checking sys/time.h presence... yes
checking for sys/time.h... yes
checking for sys/types.h... (cached) yes
checking sys/un.h usability... yes
checking sys/un.h presence... yes
checking for sys/un.h... yes
checking sys/uio.h usability... yes
checking sys/uio.h presence... yes
checking for sys/uio.h... yes
checking sys/wait.h usability... yes
checking sys/wait.h presence... yes
checking for sys/wait.h... yes
checking syslog.h usability... yes
checking syslog.h presence... yes
checking for syslog.h... yes
checking time.h usability... yes
checking time.h presence... yes
Selecting previously unselected package vim.
Preparing to unpack .../94-vim_2%3a8.1.2269-1ubuntu5.22_amd64.deb ...
Unpacking vim (2:8.1.2269-1ubuntu5.22) ...
Selecting previously unselected package libnuma-dev:amd64.
Preparing to unpack .../95-libnuma-dev_2.0.12-1_amd64.deb ...
Unpacking libnuma-dev:amd64 (2.0.12-1) ...
checking for time.h... yes
checking for unistd.h... (cached) yes
checking dirent.h usability... yes
checking dirent.h presence... yes
checking for dirent.h... yes
checking crt_externs.h usability... no
checking crt_externs.h presence... no
checking for crt_externs.h... no
checking signal.h usability... yes
checking signal.h presence... yes
Selecting previously unselected package libhwloc-dev:amd64.
Preparing to unpack .../96-libhwloc-dev_2.1.0+dfsg-4_amd64.deb ...
Unpacking libhwloc-dev:amd64 (2.1.0+dfsg-4) ...
checking for signal.h... yes
checking ioLib.h usability... no
checking ioLib.h presence... no
checking for ioLib.h... no
checking sockLib.h usability... no
checking sockLib.h presence... no
checking for sockLib.h... no
checking hostLib.h usability... no
Selecting previously unselected package libibverbs-dev:amd64.
Preparing to unpack .../97-libibverbs-dev_28.0-1ubuntu1_amd64.deb ...
Unpacking libibverbs-dev:amd64 (28.0-1ubuntu1) ...
checking hostLib.h presence... no
checking for hostLib.h... no
checking limits.h usability... yes
checking limits.h presence... yes
checking for limits.h... yes
checking sys/statfs.h usability... yes
checking sys/statfs.h presence... yes
checking for sys/statfs.h... yes
Selecting previously unselected package libopenmpi-dev:amd64.
Preparing to unpack .../98-libopenmpi-dev_4.0.3-0ubuntu1_amd64.deb ...
Unpacking libopenmpi-dev:amd64 (4.0.3-0ubuntu1) ...
checking sys/statvfs.h usability... yes
checking sys/statvfs.h presence... yes
checking for sys/statvfs.h... yes
checking for netdb.h... (cached) yes
checking ucred.h usability... no
checking ucred.h presence... no
checking for ucred.h... no
checking zlib.h usability... no
checking zlib.h presence... no
checking for zlib.h... no
checking sys/auxv.h usability... yes
checking sys/auxv.h presence... yes
checking for sys/auxv.h... yes
checking sys/sysctl.h usability... yes
checking sys/sysctl.h presence... yes
checking for sys/sysctl.h... yes
checking for sys/mount.h... yes
checking for sys/sysctl.h... (cached) yes
checking for net/if.h... yes
checking stdbool.h usability... yes
checking stdbool.h presence... yes
checking for stdbool.h... yes
checking if <stdbool.h> works... yes

============================================================================
== Type tests
============================================================================
checking for socklen_t... yes
checking for struct sockaddr_in... (cached) yes
checking for struct sockaddr_un... yes
checking for struct sockaddr_in6... yes
checking for struct sockaddr_storage... yes
checking whether AF_UNSPEC is declared... yes
checking whether PF_UNSPEC is declared... yes
checking whether AF_INET6 is declared... yes
checking whether PF_INET6 is declared... yes
checking if SA_RESTART defined in signal.h... yes
checking for struct sockaddr.sa_len... no
checking for struct dirent.d_type... yes
checking for siginfo_t.si_fd... yes
checking for siginfo_t.si_band... yes
checking for struct statfs.f_type... yes
checking for struct statfs.f_fstypename... no
checking for struct statvfs.f_basetype... no
checking for struct statvfs.f_fstypename... no
checking for struct ucred.uid... yes
checking for struct ucred.cr_uid... no
checking for struct sockpeercred.uid... no
checking for pointer diff type... ptrdiff_t (size: 8)
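 
The struct-member answers above (for example "checking for struct ucred.uid... yes") are produced by compiling a one-line accessor and seeing whether the compiler accepts it. Below is a hedged sketch of such a probe for struct ucred.uid on Linux/glibc; the file layout is an assumption, not the generated conftest.

/* Sketch of a struct-member existence probe (illustrative only). */
#define _GNU_SOURCE
#include <sys/socket.h>   /* struct ucred is exposed here with _GNU_SOURCE */

int main(void)
{
    struct ucred cred;
    cred.uid = 0;          /* compiles on Linux/glibc, hence "yes" in the log */
    return (int)cred.uid;
}
 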
checking the linker for support for the -fini option... yes
Setting up libkeyutils1:amd64 (1.6-6ubuntu1.1) ...
Setting up libpsl5:amd64 (0.21.0-1ubuntu1) ...
Setting up libgpm2:amd64 (1.20.7-5) ...
Setting up mime-support (3.64ubuntu1) ...

============================================================================
== Library and Function tests
============================================================================
checking for library containing openpty... -lutil
checking for library containing gethostbyname... none required
Setting up wget (1.20.3-1ubuntu2) ...
Setting up libmagic-mgc (1:5.38-4) ...
Setting up libtdb1:amd64 (1.4.5-0ubuntu0.20.04.1) ...
Setting up libbrotli1:amd64 (1.0.7-6ubuntu0.1) ...
Setting up ccache (3.7.7-1) ...
Updating symlinks in /usr/lib/ccache ...
Setting up libnghttp2-14:amd64 (1.40.0-1ubuntu0.2) ...
checking for library containing socket... none required
checking for library containing dirname... none required
checking for library containing ceil... -lm
Setting up libmagic1:amd64 (1:5.38-4) ...
Setting up less (551-1ubuntu0.2) ...
Setting up bc (1.07.1-2build1) ...
Setting up krb5-locales (1.17-6ubuntu4.4) ...
Setting up file (1:5.38-4) ...
Setting up libcbor0.6:amd64 (0.6.0-0ubuntu1) ...
Setting up googletest (1.10.0-2) ...
Setting up xxd (2:8.1.2269-1ubuntu5.22) ...
checking for library containing clock_gettime... none required
checking for asprintf... yes
Setting up ninja-build (1.10.0-1build1) ...
Setting up libkrb5support0:amd64 (1.17-6ubuntu4.4) ...
Setting up libxnvctrl0:amd64 (550.54.15-0ubuntu1) ...
Setting up tzdata (2024a-0ubuntu0.20.04) ...

Current default time zone: 'Etc/UTC'
Local time is now:      Thu Mar 21 19:42:34 UTC 2024.
Universal Time is now:  Thu Mar 21 19:42:34 UTC 2024.
Run 'dpkg-reconfigure tzdata' if you wish to change it.

checking for snprintf... yes
checking for vasprintf... yes
checking for vsnprintf... yes
checking for strsignal... yes
Setting up liberror-perl (0.17029-1) ...
Setting up libvorbisfile3:amd64 (1.3.6-2ubuntu1) ...
Setting up autotools-dev (20180224.1) ...
Setting up vim-common (2:8.1.2269-1ubuntu5.22) ...
Setting up librtmp1:amd64 (2.4+20151223.gitfa8646d.1-2build1) ...
Setting up libsigsegv2:amd64 (2.12-2) ...
Setting up libevent-core-2.1-7:amd64 (2.1.11-stable-1) ...
Setting up libevent-2.1-7:amd64 (2.1.11-stable-1) ...
checking for socketpair... yes
checking for strncpy_s... no
checking for usleep... yes
Setting up gcc-8-base:amd64 (8.4.0-3ubuntu2) ...
Setting up libk5crypto3:amd64 (1.17-6ubuntu4.4) ...
Setting up libltdl7:amd64 (2.4.6-14) ...
Setting up libgfortran5:amd64 (10.5.0-1ubuntu1~20.04) ...
Setting up libmpx2:amd64 (8.4.0-3ubuntu2) ...
Setting up libnuma1:amd64 (2.0.12-1) ...
Setting up sound-theme-freedesktop (0.8-2ubuntu1) ...
Setting up ocl-icd-libopencl1:amd64 (2.2.11-1ubuntu1) ...
Setting up libnl-3-200:amd64 (3.4.0-1ubuntu0.1) ...
Setting up libpsm2-2 (11.2.86-1) ...
Setting up openmpi-common (4.0.3-0ubuntu1) ...
checking for statfs... yes
checking for statvfs... yes
checking for getpeereid... no
Setting up git-man (1:2.25.1-1ubuntu3.11) ...
Setting up libkrb5-3:amd64 (1.17-6ubuntu4.4) ...
Setting up libpsm-infinipath1 (3.3+20.604758e7-6) ...
update-alternatives: using /usr/lib/libpsm1/libpsm_infinipath.so.1.16 to provide /usr/lib/x86_64-linux-gnu/libpsm_infinipath.so.1 (libpsm_infinipath.so.1) in auto mode
Setting up libmpdec2:amd64 (2.4.2-3) ...
Setting up cpp-8 (8.4.0-3ubuntu2) ...
Setting up vim-runtime (2:8.1.2269-1ubuntu5.22) ...
checking for getpeerucred... no
checking for strnlen... yes
checking for posix_fallocate... yes
checking for tcgetpgrp... yes
Setting up libpython3.8-stdlib:amd64 (3.8.10-0ubuntu1~20.04.9) ...
Setting up libfido2-1:amd64 (1.3.1-1ubuntu2) ...
Setting up python3.8 (3.8.10-0ubuntu1~20.04.9) ...
checking for setpgid... yes
checking for ptsname... yes
checking for openpty... yes
checking for setenv... yes
checking for htonl define... no
checking for htonl... yes
checking whether va_copy is declared... yes
checking whether __va_copy is declared... yes
checking whether __func__ is declared... yes

============================================================================
== System-specific tests
============================================================================
checking whether byte ordering is bigendian... no
checking for broken qsort... no
checking if C compiler and POSIX threads work as is... no
checking if C compiler and POSIX threads work with -Kthread... no
checking if C compiler and POSIX threads work with -kthread... no
checking if C compiler and POSIX threads work with -pthread... yes
checking for pthread_mutexattr_setpshared... yes
checking for pthread_condattr_setpshared... yes
checking for PTHREAD_MUTEX_ERRORCHECK_NP... yes
checking for PTHREAD_MUTEX_ERRORCHECK... yes
checking for working POSIX threads package... yes
checking if threads have different pids (pthreads on linux)... no
checking whether ln -s works... yes
checking for grep that handles long lines and -e... (cached) /usr/bin/grep
checking for egrep... (cached) /usr/bin/grep -E

============================================================================
== Symbol visibility feature
============================================================================
checking whether to enable symbol visibility... no (disabled)

============================================================================
== Libevent
============================================================================
checking --with-libev value... simple ok (unspecified)
checking --with-libev-libdir value... simple ok (unspecified)
checking will libev support be built... no
checking for libevent... assumed available (embedded mode)
checking libevent header... "opal/mca/event/libevent2022/libevent2022.h"
checking libevent2/thread header... "opal/mca/event/libevent2022/libevent2022.h"

============================================================================
== HWLOC
============================================================================
checking for hwloc... assumed available (embedded mode)
checking hwloc header... "opal/mca/hwloc/hwloc201/hwloc201.h"

============================================================================
== ZLIB
============================================================================
checking for zlib in... (default search paths)
looking for header without includes
checking zlib.h usability... no
checking zlib.h presence... no
checking for zlib.h... no
checking will zlib support be built... no

============================================================================
== Modular Component Architecture (MCA) setup
============================================================================
checking for default value of mca_base_component_show_load_errors... enabled by default
checking for subdir args...  '--with-pmix-symbol-rename=OPAL_MCA_PMIX3X_' '--enable-embedded-mode' '--disable-debug' '--disable-pmix-timing' '--without-tests-examples' '--disable-pmix-binaries' '--disable-pmix-backward-compatibility' '--disable-visibility' '--enable-embedded-libevent' '--with-libevent-header=opal/mca/event/libevent2022/libevent2022.h' '--enable-embedded-hwloc' '--with-hwloc-header=opal/mca/hwloc/hwloc201/hwloc201.h' '--prefix=/opt/openmpi' '--with-cuda' 'CFLAGS=-w' 'CFLAGS=-O3 -DNDEBUG -w ' 'CPPFLAGS=-I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/include -I/scratch/build/opal/mca/event/libevent2022/libevent/include -I/scratch/openmpi/opal/mca/event/libevent2022/libevent -I/scratch/openmpi/opal/mca/event/libevent2022/libevent/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include  ' '--disable-option-checking'
checking which components should be disabled... 
checking which components should be direct-linked into the library... 
checking which components should be run-time loadable... all
checking which components should be static... none

*** Configuring MCA
checking for frameworks... common, bfrops, gds, pdl, pif, pinstalldirs, plog, pnet, preg, psec, psensor, pshmem, ptl

+++ Configuring MCA framework common
checking for no configure components in framework common... 
checking for m4 configure components in framework common... dstore

--- MCA component common:dstore (m4 configuration macro)
checking for MCA component common:dstore compile mode... dso
checking if MCA component common:dstore can compile... yes

+++ Configuring MCA framework bfrops
checking for no configure components in framework bfrops... v12, v20, v21, v3
checking for m4 configure components in framework bfrops... 

--- MCA component bfrops:v12 (no configuration)
checking for MCA component bfrops:v12 compile mode... dso
checking if MCA component bfrops:v12 can compile... yes

--- MCA component bfrops:v20 (no configuration)
checking for MCA component bfrops:v20 compile mode... dso
checking if MCA component bfrops:v20 can compile... yes

--- MCA component bfrops:v21 (no configuration)
checking for MCA component bfrops:v21 compile mode... dso
checking if MCA component bfrops:v21 can compile... yes

--- MCA component bfrops:v3 (no configuration)
checking for MCA component bfrops:v3 compile mode... dso
checking if MCA component bfrops:v3 can compile... yes

+++ Configuring MCA framework gds
checking for no configure components in framework gds... ds12, ds21, hash
checking for m4 configure components in framework gds... 

--- MCA component gds:ds12 (no configuration)
checking for MCA component gds:ds12 compile mode... dso
checking if MCA component gds:ds12 can compile... yes

--- MCA component gds:ds21 (no configuration)
checking for MCA component gds:ds21 compile mode... dso
checking if MCA component gds:ds21 can compile... yes

--- MCA component gds:hash (no configuration)
checking for MCA component gds:hash compile mode... dso
checking if MCA component gds:hash can compile... yes

+++ Configuring MCA framework pdl
checking for no configure components in framework pdl... 
checking for m4 configure components in framework pdl... pdlopen, plibltdl

--- MCA component pdl:pdlopen (m4 configuration macro, priority 80)
checking for MCA component pdl:pdlopen compile mode... static
looking for header without includes
checking dlfcn.h usability... yes
checking dlfcn.h presence... yes
checking for dlfcn.h... yes
looking for library without search path
checking for library containing dlopen... -ldl
checking if MCA component pdl:pdlopen can compile... yes

--- MCA component pdl:plibltdl (m4 configuration macro, priority 50)
checking for MCA component pdl:plibltdl compile mode... static
checking --with-plibltdl value... simple ok (unspecified)
checking --with-plibltdl-libdir value... simple ok (unspecified)
checking for libltdl dir... compiler default
checking for libltdl library dir... linker default
looking for header without includes
checking ltdl.h usability... no
checking ltdl.h presence... no
checking for ltdl.h... no
checking if MCA component pdl:plibltdl can compile... no

+++ Configuring MCA framework pif
checking for no configure components in framework pif... 
checking for m4 configure components in framework pif... bsdx_ipv4, bsdx_ipv6, linux_ipv6, posix_ipv4, solaris_ipv6

--- MCA component pif:bsdx_ipv4 (m4 configuration macro)
checking for MCA component pif:bsdx_ipv4 compile mode... static
checking struct sockaddr... yes (cached)
checking NetBSD, FreeBSD, OpenBSD, or DragonFly... no
checking if MCA component pif:bsdx_ipv4 can compile... no

--- MCA component pif:bsdx_ipv6 (m4 configuration macro)
checking for MCA component pif:bsdx_ipv6 compile mode... static
checking struct sockaddr... yes (cached)
checking some flavor of BSD... no
checking if MCA component pif:bsdx_ipv6 can compile... no

--- MCA component pif:linux_ipv6 (m4 configuration macro)
checking for MCA component pif:linux_ipv6 compile mode... static
checking if we are on Linux with TCP... yes
checking if MCA component pif:linux_ipv6 can compile... yes

--- MCA component pif:posix_ipv4 (m4 configuration macro)
checking for MCA component pif:posix_ipv4 compile mode... static
checking struct sockaddr... yes (cached)
checking not NetBSD, FreeBSD, OpenBSD, or DragonFly... yes
checking for struct ifreq.ifr_hwaddr... yes
checking for struct ifreq.ifr_mtu... yes
Setting up publicsuffix (20200303.0012-1) ...
Setting up libxmuu1:amd64 (2:1.1.3-0ubuntu1) ...
Setting up libonig5:amd64 (6.9.4-1) ...
Setting up libpython3-stdlib:amd64 (3.8.2-0ubuntu2) ...
Setting up libevent-pthreads-2.1-7:amd64 (2.1.11-stable-1) ...
Setting up libcanberra0:amd64 (0.30-7ubuntu1) ...
Setting up libevent-extra-2.1-7:amd64 (2.1.11-stable-1) ...
Setting up libgtest-dev:amd64 (1.10.0-2) ...
Setting up libtool (2.4.6-14) ...
Setting up libjq1:amd64 (1.6-1ubuntu0.20.04.1) ...
Setting up libicu66:amd64 (66.1-2ubuntu2.1) ...
Setting up libgfortran-9-dev:amd64 (9.4.0-1ubuntu1~20.04.2) ...
Setting up libevent-openssl-2.1-7:amd64 (2.1.11-stable-1) ...
Setting up m4 (1.4.18-4) ...
Setting up python3 (3.8.2-0ubuntu2) ...
checking if MCA component pif:posix_ipv4 can compile... yes

--- MCA component pif:solaris_ipv6 (m4 configuration macro)
checking for MCA component pif:solaris_ipv6 compile mode... static
checking if MCA component pif:solaris_ipv6 can compile... no

+++ Configuring MCA framework pinstalldirs
checking for no configure components in framework pinstalldirs... 
checking for m4 configure components in framework pinstalldirs... config, env

--- MCA component pinstalldirs:env (m4 configuration macro, priority 10)
checking for MCA component pinstalldirs:env compile mode... static
checking if MCA component pinstalldirs:env can compile... yes

--- MCA component pinstalldirs:config (m4 configuration macro, priority 0)
checking for MCA component pinstalldirs:config compile mode... static
checking if MCA component pinstalldirs:config can compile... yes

+++ Configuring MCA framework plog
checking for no configure components in framework plog... default, stdfd
checking for m4 configure components in framework plog... syslog

--- MCA component plog:default (no configuration)
checking for MCA component plog:default compile mode... dso
checking if MCA component plog:default can compile... yes

--- MCA component plog:stdfd (no configuration)
checking for MCA component plog:stdfd compile mode... dso
checking if MCA component plog:stdfd can compile... yes

--- MCA component plog:syslog (m4 configuration macro)
checking for MCA component plog:syslog compile mode... dso
checking for syslog.h... (cached) yes
checking if MCA component plog:syslog can compile... yes

+++ Configuring MCA framework pnet
checking for no configure components in framework pnet... tcp, test
checking for m4 configure components in framework pnet... opa

--- MCA component pnet:tcp (no configuration)
checking for MCA component pnet:tcp compile mode... dso
checking if MCA component pnet:tcp can compile... yes

--- MCA component pnet:test (no configuration)
checking for MCA component pnet:test compile mode... dso
checking if MCA component pnet:test can compile... yes

--- MCA component pnet:opa (m4 configuration macro)
Setting up libnuma-dev:amd64 (2.0.12-1) ...
checking for MCA component pnet:opa compile mode... dso
checking --with-psm2 value... simple ok (unspecified)
checking --with-psm2-libdir value... simple ok (unspecified)
looking for header without includes
checking psm2.h usability... no
checking psm2.h presence... no
checking for psm2.h... no
checking if opamgt requested... no
checking if MCA component pnet:opa can compile... no

+++ Configuring MCA framework preg
checking for no configure components in framework preg... native
checking for m4 configure components in framework preg... 

--- MCA component preg:native (no configuration)
checking for MCA component preg:native compile mode... dso
checking if MCA component preg:native can compile... yes

+++ Configuring MCA framework psec
checking for no configure components in framework psec... dummy_handshake, native, none
checking for m4 configure components in framework psec... munge

--- MCA component psec:dummy_handshake (no configuration)
checking for MCA component psec:dummy_handshake compile mode... dso
checking if MCA component psec:dummy_handshake can compile... no

--- MCA component psec:native (no configuration)
checking for MCA component psec:native compile mode... dso
checking if MCA component psec:native can compile... yes

--- MCA component psec:none (no configuration)
checking for MCA component psec:none compile mode... dso
checking if MCA component psec:none can compile... yes

--- MCA component psec:munge (m4 configuration macro)
checking for MCA component psec:munge compile mode... dso
checking will munge support be built... no
checking if MCA component psec:munge can compile... no

+++ Configuring MCA framework psensor
Setting up libnl-route-3-200:amd64 (3.4.0-1ubuntu0.1) ...
Setting up libpython3.8:amd64 (3.8.10-0ubuntu1~20.04.9) ...
Setting up libevent-dev (2.1.11-stable-1) ...
Setting up libhwloc15:amd64 (2.1.0+dfsg-4) ...
Setting up libgssapi-krb5-2:amd64 (1.17-6ubuntu4.4) ...
Setting up libgcc-8-dev:amd64 (8.4.0-3ubuntu2) ...
Setting up libssh-4:amd64 (0.9.3-2ubuntu2.5) ...
Setting up autoconf (2.69-11.1) ...
checking for no configure components in framework psensor... file, heartbeat
checking for m4 configure components in framework psensor... 

--- MCA component psensor:file (no configuration)
checking for MCA component psensor:file compile mode... dso
checking if MCA component psensor:file can compile... yes

--- MCA component psensor:heartbeat (no configuration)
checking for MCA component psensor:heartbeat compile mode... dso
checking if MCA component psensor:heartbeat can compile... yes

+++ Configuring MCA framework pshmem
checking for no configure components in framework pshmem... mmap
checking for m4 configure components in framework pshmem... 

--- MCA component pshmem:mmap (no configuration)
checking for MCA component pshmem:mmap compile mode... dso
checking if MCA component pshmem:mmap can compile... yes

+++ Configuring MCA framework ptl
checking for no configure components in framework ptl... tcp, usock
checking for m4 configure components in framework ptl... 

--- MCA component ptl:tcp (no configuration)
checking for MCA component ptl:tcp compile mode... dso
checking if MCA component ptl:tcp can compile... yes

--- MCA component ptl:usock (no configuration)
checking for MCA component ptl:usock compile mode... dso
checking if MCA component ptl:usock can compile... yes

============================================================================
== Dstore Locking
============================================================================
checking for struct flock.l_type... yes
checking for pthread_rwlockattr_setkind_np... yes
Setting up libnl-3-dev:amd64 (3.4.0-1ubuntu0.1) ...
Setting up xauth (1:1.1-0ubuntu1) ...
Setting up jq (1.6-1ubuntu0.20.04.1) ...
Setting up libxml2:amd64 (2.9.10+dfsg-5ubuntu0.20.04.7) ...
Setting up automake (1:1.16.1-4ubuntu6) ...
update-alternatives: using /usr/bin/automake-1.16 to provide /usr/bin/automake (automake) in auto mode
update-alternatives: warning: skip creation of /usr/share/man/man1/automake.1.gz because associated file /usr/share/man/man1/automake-1.16.1.gz (of link group automake) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/man1/aclocal.1.gz because associated file /usr/share/man/man1/aclocal-1.16.1.gz (of link group automake) doesn't exist
Setting up libgfortran-8-dev:amd64 (8.4.0-3ubuntu2) ...
Setting up libibverbs1:amd64 (28.0-1ubuntu1) ...
checking for pthread_rwlockattr_setpshared... yes

*** Set path-related compiler flags

*** Final output
Setting up gcc-8 (8.4.0-3ubuntu2) ...
Setting up vim (2:8.1.2269-1ubuntu5.22) ...
update-alternatives: using /usr/bin/vim.basic to provide /usr/bin/vim (vim) in auto mode
update-alternatives: using /usr/bin/vim.basic to provide /usr/bin/vimdiff (vimdiff) in auto mode
update-alternatives: using /usr/bin/vim.basic to provide /usr/bin/rvim (rvim) in auto mode
update-alternatives: using /usr/bin/vim.basic to provide /usr/bin/rview (rview) in auto mode
update-alternatives: using /usr/bin/vim.basic to provide /usr/bin/vi (vi) in auto mode
update-alternatives: warning: skip creation of /usr/share/man/da/man1/vi.1.gz because associated file /usr/share/man/da/man1/vim.1.gz (of link group vi) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/de/man1/vi.1.gz because associated file /usr/share/man/de/man1/vim.1.gz (of link group vi) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/fr/man1/vi.1.gz because associated file /usr/share/man/fr/man1/vim.1.gz (of link group vi) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/it/man1/vi.1.gz because associated file /usr/share/man/it/man1/vim.1.gz (of link group vi) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/ja/man1/vi.1.gz because associated file /usr/share/man/ja/man1/vim.1.gz (of link group vi) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/pl/man1/vi.1.gz because associated file /usr/share/man/pl/man1/vim.1.gz (of link group vi) doesn't exist

============================================================================
== Final compiler flags
============================================================================
checking final CPPFLAGS... -I/scratch/build/opal/mca/pmix/pmix3x/pmix -I/scratch/openmpi/opal/mca/pmix/pmix3x/pmix -I/scratch/openmpi/opal/mca/pmix/pmix3x/pmix/src -I/scratch/build/opal/mca/pmix/pmix3x/pmix/include -I/scratch/openmpi/opal/mca/pmix/pmix3x/pmix/include -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/include -I/scratch/build/opal/mca/event/libevent2022/libevent/include -I/scratch/openmpi/opal/mca/event/libevent2022/libevent -I/scratch/openmpi/opal/mca/event/libevent2022/libevent/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include
checking final CFLAGS... -DNDEBUG -O3 -w -finline-functions -fno-strict-aliasing -mcx16 -pthread
checking final LDFLAGS... 
checking final LIBS... -lm -lutil -ldl

============================================================================
== Configuration complete
============================================================================
checking that generated files are newer than configure... done
configure: creating ./config.status
update-alternatives: warning: skip creation of /usr/share/man/ru/man1/vi.1.gz because associated file /usr/share/man/ru/man1/vim.1.gz (of link group vi) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/man1/vi.1.gz because associated file /usr/share/man/man1/vim.1.gz (of link group vi) doesn't exist
update-alternatives: using /usr/bin/vim.basic to provide /usr/bin/view (view) in auto mode
update-alternatives: warning: skip creation of /usr/share/man/da/man1/view.1.gz because associated file /usr/share/man/da/man1/vim.1.gz (of link group view) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/de/man1/view.1.gz because associated file /usr/share/man/de/man1/vim.1.gz (of link group view) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/fr/man1/view.1.gz because associated file /usr/share/man/fr/man1/vim.1.gz (of link group view) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/it/man1/view.1.gz because associated file /usr/share/man/it/man1/vim.1.gz (of link group view) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/ja/man1/view.1.gz because associated file /usr/share/man/ja/man1/vim.1.gz (of link group view) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/pl/man1/view.1.gz because associated file /usr/share/man/pl/man1/vim.1.gz (of link group view) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/ru/man1/view.1.gz because associated file /usr/share/man/ru/man1/vim.1.gz (of link group view) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/man1/view.1.gz because associated file /usr/share/man/man1/vim.1.gz (of link group view) doesn't exist
update-alternatives: using /usr/bin/vim.basic to provide /usr/bin/ex (ex) in auto mode
update-alternatives: warning: skip creation of /usr/share/man/da/man1/ex.1.gz because associated file /usr/share/man/da/man1/vim.1.gz (of link group ex) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/de/man1/ex.1.gz because associated file /usr/share/man/de/man1/vim.1.gz (of link group ex) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/fr/man1/ex.1.gz because associated file /usr/share/man/fr/man1/vim.1.gz (of link group ex) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/it/man1/ex.1.gz because associated file /usr/share/man/it/man1/vim.1.gz (of link group ex) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/ja/man1/ex.1.gz because associated file /usr/share/man/ja/man1/vim.1.gz (of link group ex) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/pl/man1/ex.1.gz because associated file /usr/share/man/pl/man1/vim.1.gz (of link group ex) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/ru/man1/ex.1.gz because associated file /usr/share/man/ru/man1/vim.1.gz (of link group ex) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/man1/ex.1.gz because associated file /usr/share/man/man1/vim.1.gz (of link group ex) doesn't exist
update-alternatives: using /usr/bin/vim.basic to provide /usr/bin/editor (editor) in auto mode
update-alternatives: warning: skip creation of /usr/share/man/da/man1/editor.1.gz because associated file /usr/share/man/da/man1/vim.1.gz (of link group editor) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/de/man1/editor.1.gz because associated file /usr/share/man/de/man1/vim.1.gz (of link group editor) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/fr/man1/editor.1.gz because associated file /usr/share/man/fr/man1/vim.1.gz (of link group editor) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/it/man1/editor.1.gz because associated file /usr/share/man/it/man1/vim.1.gz (of link group editor) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/ja/man1/editor.1.gz because associated file /usr/share/man/ja/man1/vim.1.gz (of link group editor) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/pl/man1/editor.1.gz because associated file /usr/share/man/pl/man1/vim.1.gz (of link group editor) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/ru/man1/editor.1.gz because associated file /usr/share/man/ru/man1/vim.1.gz (of link group editor) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/man1/editor.1.gz because associated file /usr/share/man/man1/vim.1.gz (of link group editor) doesn't exist
Setting up gfortran-9 (9.4.0-1ubuntu1~20.04.2) ...
Setting up ibverbs-providers:amd64 (28.0-1ubuntu1) ...
Setting up openssh-client (1:8.2p1-4ubuntu0.11) ...
Setting up gfortran-8 (8.4.0-3ubuntu2) ...
Setting up libcurl3-gnutls:amd64 (7.68.0-1ubuntu2.21) ...
Setting up libhwloc-plugins:amd64 (2.1.0+dfsg-4) ...
Setting up gfortran (4:9.3.0-1ubuntu2) ...
update-alternatives: using /usr/bin/gfortran to provide /usr/bin/f95 (f95) in auto mode
update-alternatives: warning: skip creation of /usr/share/man/man1/f95.1.gz because associated file /usr/share/man/man1/gfortran.1.gz (of link group f95) doesn't exist
update-alternatives: using /usr/bin/gfortran to provide /usr/bin/f77 (f77) in auto mode
update-alternatives: warning: skip creation of /usr/share/man/man1/f77.1.gz because associated file /usr/share/man/man1/gfortran.1.gz (of link group f77) doesn't exist
Setting up libnl-route-3-dev:amd64 (3.4.0-1ubuntu0.1) ...
Setting up libltdl-dev:amd64 (2.4.6-14) ...
config.status: creating include/pmix_version.h
config.status: creating include/pmix_rename.h
config.status: creating src/mca/common/Makefile
Setting up git (1:2.25.1-1ubuntu3.11) ...
Setting up libhwloc-dev:amd64 (2.1.0+dfsg-4) ...
Setting up librdmacm1:amd64 (28.0-1ubuntu1) ...
Setting up libpmix2:amd64 (3.1.5-1) ...
Setting up libcoarrays-dev:amd64 (2.8.0-1) ...
Setting up libibverbs-dev:amd64 (28.0-1ubuntu1) ...
config.status: creating src/mca/common/dstore/Makefile
config.status: creating src/mca/bfrops/Makefile
config.status: creating src/mca/bfrops/v12/Makefile
config.status: creating src/mca/bfrops/v20/Makefile
config.status: creating src/mca/bfrops/v21/Makefile
config.status: creating src/mca/bfrops/v3/Makefile
config.status: creating src/mca/gds/Makefile
config.status: creating src/mca/gds/ds12/Makefile
config.status: creating src/mca/gds/ds21/Makefile
config.status: creating src/mca/gds/hash/Makefile
config.status: creating src/mca/pdl/Makefile
config.status: creating src/mca/pdl/pdlopen/Makefile
Setting up libfabric1 (1.6.2-3ubuntu0.1) ...
Setting up libopenmpi3:amd64 (4.0.3-0ubuntu1) ...
Setting up libcaf-openmpi-3:amd64 (2.8.0-1) ...
Setting up openmpi-bin (4.0.3-0ubuntu1) ...
update-alternatives: using /usr/bin/mpirun.openmpi to provide /usr/bin/mpirun (mpirun) in auto mode
update-alternatives: warning: skip creation of /usr/share/man/man1/mpirun.1.gz because associated file /usr/share/man/man1/mpirun.openmpi.1.gz (of link group mpirun) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/man1/mpiexec.1.gz because associated file /usr/share/man/man1/mpiexec.openmpi.1.gz (of link group mpirun) doesn't exist
update-alternatives: using /usr/bin/mpicc.openmpi to provide /usr/bin/mpicc (mpi) in auto mode
update-alternatives: warning: skip creation of /usr/share/man/man1/mpicc.1.gz because associated file /usr/share/man/man1/mpicc.openmpi.1.gz (of link group mpi) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/man1/mpic++.1.gz because associated file /usr/share/man/man1/mpic++.openmpi.1.gz (of link group mpi) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/man1/mpicxx.1.gz because associated file /usr/share/man/man1/mpicxx.openmpi.1.gz (of link group mpi) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/man1/mpiCC.1.gz because associated file /usr/share/man/man1/mpiCC.openmpi.1.gz (of link group mpi) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/man1/mpif77.1.gz because associated file /usr/share/man/man1/mpif77.openmpi.1.gz (of link group mpi) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/man1/mpif90.1.gz because associated file /usr/share/man/man1/mpif90.openmpi.1.gz (of link group mpi) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/man1/mpifort.1.gz because associated file /usr/share/man/man1/mpifort.openmpi.1.gz (of link group mpi) doesn't exist
Setting up libcoarrays-openmpi-dev:amd64 (2.8.0-1) ...
Setting up libopenmpi-dev:amd64 (4.0.3-0ubuntu1) ...
config.status: creating src/mca/pdl/plibltdl/Makefile
config.status: creating src/mca/pif/Makefile
config.status: creating src/mca/pif/bsdx_ipv4/Makefile
config.status: creating src/mca/pif/bsdx_ipv6/Makefile
config.status: creating src/mca/pif/linux_ipv6/Makefile
config.status: creating src/mca/pif/posix_ipv4/Makefile
config.status: creating src/mca/pif/solaris_ipv6/Makefile
config.status: creating src/mca/pinstalldirs/Makefile
config.status: creating src/mca/pinstalldirs/env/Makefile
config.status: creating src/mca/pinstalldirs/config/Makefile
config.status: creating src/mca/pinstalldirs/config/pinstall_dirs.h
config.status: creating src/mca/plog/Makefile
update-alternatives: using /usr/lib/x86_64-linux-gnu/openmpi/include to provide /usr/include/x86_64-linux-gnu/mpi (mpi-x86_64-linux-gnu) in auto mode
Processing triggers for libc-bin (2.31-0ubuntu9.12) ...
config.status: creating src/mca/plog/default/Makefile
config.status: creating src/mca/plog/stdfd/Makefile
config.status: creating src/mca/plog/syslog/Makefile
config.status: creating src/mca/pnet/Makefile
config.status: creating src/mca/pnet/tcp/Makefile
config.status: creating src/mca/pnet/test/Makefile
config.status: creating src/mca/pnet/opa/Makefile
config.status: creating src/mca/preg/Makefile
config.status: creating src/mca/preg/native/Makefile
config.status: creating src/mca/psec/Makefile
config.status: creating src/mca/psec/dummy_handshake/Makefile
config.status: creating src/mca/psec/native/Makefile
config.status: creating src/mca/psec/none/Makefile
config.status: creating src/mca/psec/munge/Makefile
config.status: creating src/mca/psensor/Makefile
config.status: creating src/mca/psensor/file/Makefile
config.status: creating src/mca/psensor/heartbeat/Makefile
config.status: creating src/mca/pshmem/Makefile
config.status: creating src/mca/pshmem/mmap/Makefile
config.status: creating src/mca/ptl/Makefile
config.status: creating src/mca/ptl/tcp/Makefile
config.status: creating src/mca/ptl/usock/Makefile
config.status: creating test/run_tests00.pl
config.status: creating test/run_tests01.pl
config.status: creating test/run_tests02.pl
config.status: creating test/run_tests03.pl
config.status: creating test/run_tests04.pl
config.status: creating test/run_tests05.pl
config.status: creating test/run_tests06.pl
config.status: creating test/run_tests07.pl
config.status: creating test/run_tests08.pl
config.status: creating test/run_tests09.pl
config.status: creating test/run_tests10.pl
config.status: creating test/run_tests11.pl
config.status: creating test/run_tests12.pl
config.status: creating test/run_tests13.pl
config.status: creating test/run_tests14.pl
config.status: creating test/run_tests15.pl
config.status: creating Makefile
config.status: creating config/Makefile
config.status: creating etc/Makefile
config.status: creating include/Makefile
config.status: creating src/Makefile
config.status: creating src/util/keyval/Makefile
config.status: creating src/mca/base/Makefile
config.status: creating src/tools/pevent/Makefile
config.status: creating src/tools/pmix_info/Makefile
config.status: creating src/tools/plookup/Makefile
config.status: creating src/tools/pps/Makefile
config.status: creating contrib/Makefile
config.status: creating examples/Makefile
config.status: creating test/Makefile
config.status: creating test/simple/Makefile
config.status: creating src/include/pmix_config.h
config.status: creating include/pmix_common.h
config.status: executing depfiles commands
config.status: executing libtool commands

PMIx configuration:
-----------------------
Version: 3.1.4
Debug build: no
Platform file: (none)

External Packages
-----------------------
HWLOC: yes (embedded)
Libevent: yes (embedded)
 
configure: /bin/bash '../../../../../../openmpi/opal/mca/pmix/pmix3x/pmix/configure' succeeded for opal/mca/pmix/pmix3x/pmix
checking if v3.x component is to be used... yes - using the internal v3.x library
checking PMIx extra wrapper CPPFLAGS... 
checking PMIx extra wrapper LDFLAGS... 
checking PMIx extra wrapper LIBS... 
checking if MCA component pmix:pmix3x can compile... yes

--- MCA component pmix:s1 (m4 configuration macro)
checking for MCA component pmix:s1 compile mode... dso
checking if MCA component pmix:s1 can compile... no

--- MCA component pmix:s2 (m4 configuration macro)
checking for MCA component pmix:s2 compile mode... dso
checking if MCA component pmix:s2 can compile... no

+++ Configuring MCA framework pstat
checking for no configure components in framework pstat... 
checking for m4 configure components in framework pstat... linux, test

--- MCA component pstat:linux (m4 configuration macro, priority 60)
checking for MCA component pstat:linux compile mode... dso
checking whether HZ is declared... yes
checking if MCA component pstat:linux can compile... yes

--- MCA component pstat:test (m4 configuration macro, priority 10)
checking for MCA component pstat:test compile mode... dso
checking if MCA component pstat:test can compile... no

+++ Configuring MCA framework rcache
checking for no configure components in framework rcache... grdma
checking for m4 configure components in framework rcache... gpusm, rgpusm, udreg

--- MCA component rcache:grdma (no configuration)
checking for MCA component rcache:grdma compile mode... dso
checking if MCA component rcache:grdma can compile... yes

--- MCA component rcache:gpusm (m4 configuration macro)
checking for MCA component rcache:gpusm compile mode... dso
checking if MCA component rcache:gpusm can compile... yes

--- MCA component rcache:rgpusm (m4 configuration macro)
checking for MCA component rcache:rgpusm compile mode... dso
checking if MCA component rcache:rgpusm can compile... yes

--- MCA component rcache:udreg (m4 configuration macro)
checking for MCA component rcache:udreg compile mode... dso
checking for CRAY_UDREG... no
no
checking if MCA component rcache:udreg can compile... no

+++ Configuring MCA framework reachable
checking for no configure components in framework reachable... weighted
checking for m4 configure components in framework reachable... netlink

--- MCA component reachable:weighted (no configuration)
checking for MCA component reachable:weighted compile mode... dso
checking if MCA component reachable:weighted can compile... yes

--- MCA component reachable:netlink (m4 configuration macro)
checking for MCA component reachable:netlink compile mode... dso
checking for linux/netlink.h... yes
configure: checking for libnl v3
checking for /usr/include/libnl3... not found
checking for /usr/local/include/libnl3... not found
checking if MCA component reachable:netlink can compile... no

+++ Configuring MCA framework shmem
checking for no configure components in framework shmem... 
checking for m4 configure components in framework shmem... mmap, posix, sysv

--- MCA component shmem:mmap (m4 configuration macro)
checking for MCA component shmem:mmap compile mode... dso
checking if want mmap shared memory support... yes
checking for library containing mmap... none required
checking if MCA component shmem:mmap can compile... yes

--- MCA component shmem:posix (m4 configuration macro)
checking for MCA component shmem:posix compile mode... dso
checking if want POSIX shared memory support... yes
checking for library containing shm_open... -lrt
checking if MCA component shmem:posix can compile... yes

--- MCA component shmem:sysv (m4 configuration macro)
checking for MCA component shmem:sysv compile mode... dso
checking if want SYSV shared memory support... yes
checking for shmget... yes
checking if MCA component shmem:sysv can compile... yes

+++ Configuring MCA framework timer
checking for no configure components in framework timer... 
checking for m4 configure components in framework timer... altix, darwin, linux, solaris

--- MCA component timer:altix (m4 configuration macro, priority 100)
checking for MCA component timer:altix compile mode... static
checking sn/mmtimer.h usability... no
checking sn/mmtimer.h presence... no
checking for sn/mmtimer.h... no
checking if MCA component timer:altix can compile... no

--- MCA component timer:darwin (m4 configuration macro, priority 30)
checking for MCA component timer:darwin compile mode... static
checking mach/mach_time.h usability... no
checking mach/mach_time.h presence... no
checking for mach/mach_time.h... no
checking for mach_absolute_time... no
checking if MCA component timer:darwin can compile... no

--- MCA component timer:linux (m4 configuration macro, priority 30)
checking for MCA component timer:linux compile mode... static
checking if MCA component timer:linux can compile... yes

--- MCA component timer:solaris (m4 configuration macro, priority 30)
checking for MCA component timer:solaris compile mode... static
checking for gethrtime... no
checking if MCA component timer:solaris can compile... no

*** Configuring MCA for orte
checking for frameworks for orte... common, errmgr, ess, filem, grpcomm, iof, odls, oob, plm, ras, regx, rmaps, rml, routed, rtc, schizo, snapc, sstore, state

+++ Configuring MCA framework common
checking for no configure components in framework common... 
checking for m4 configure components in framework common... alps

--- MCA component common:alps (m4 configuration macro)
checking for MCA component common:alps compile mode... dso
checking if MCA component common:alps can compile... no

+++ Configuring MCA framework errmgr
checking for no configure components in framework errmgr... default_app, default_hnp, default_orted, default_tool
checking for m4 configure components in framework errmgr... 

--- MCA component errmgr:default_app (no configuration)
checking for MCA component errmgr:default_app compile mode... dso
checking if MCA component errmgr:default_app can compile... yes

--- MCA component errmgr:default_hnp (no configuration)
checking for MCA component errmgr:default_hnp compile mode... dso
checking if MCA component errmgr:default_hnp can compile... yes

--- MCA component errmgr:default_orted (no configuration)
checking for MCA component errmgr:default_orted compile mode... dso
checking if MCA component errmgr:default_orted can compile... yes

--- MCA component errmgr:default_tool (no configuration)
checking for MCA component errmgr:default_tool compile mode... dso
checking if MCA component errmgr:default_tool can compile... yes

+++ Configuring MCA framework ess
checking for no configure components in framework ess... env, hnp, pmi, singleton, tool
checking for m4 configure components in framework ess... alps, lsf, slurm, tm

--- MCA component ess:env (no configuration)
checking for MCA component ess:env compile mode... dso
checking if MCA component ess:env can compile... yes

--- MCA component ess:hnp (no configuration)
checking for MCA component ess:hnp compile mode... dso
checking if MCA component ess:hnp can compile... yes

--- MCA component ess:pmi (no configuration)
checking for MCA component ess:pmi compile mode... dso
checking if MCA component ess:pmi can compile... yes

--- MCA component ess:singleton (no configuration)
checking for MCA component ess:singleton compile mode... dso
checking if MCA component ess:singleton can compile... yes

--- MCA component ess:tool (no configuration)
checking for MCA component ess:tool compile mode... dso
checking if MCA component ess:tool can compile... yes

--- MCA component ess:alps (m4 configuration macro)
checking for MCA component ess:alps compile mode... dso
checking if MCA component ess:alps can compile... no

--- MCA component ess:lsf (m4 configuration macro)
checking for MCA component ess:lsf compile mode... dso
checking --with-lsf value... simple ok (unspecified value)
checking --with-lsf-libdir value... simple ok (unspecified value)
checking for library containing yp_all... -lnsl
checking lsf/lsf.h usability... no
checking lsf/lsf.h presence... no
checking for lsf/lsf.h... no
checking if MCA component ess:lsf can compile... no

--- MCA component ess:slurm (m4 configuration macro)
checking for MCA component ess:slurm compile mode... dso
checking for fork... (cached) yes
checking for execve... (cached) yes
checking for setpgid... (cached) yes
checking if MCA component ess:slurm can compile... yes

--- MCA component ess:tm (m4 configuration macro)
checking for MCA component ess:tm compile mode... dso
checking --with-tm value... simple ok (unspecified value)
checking for pbs-config... not found
checking tm.h usability... no
checking tm.h presence... no
checking for tm.h... no
checking if MCA component ess:tm can compile... no

+++ Configuring MCA framework filem
checking for no configure components in framework filem... raw
checking for m4 configure components in framework filem... 

--- MCA component filem:raw (no configuration)
checking for MCA component filem:raw compile mode... dso
checking if MCA component filem:raw can compile... yes

+++ Configuring MCA framework grpcomm
checking for no configure components in framework grpcomm... direct
checking for m4 configure components in framework grpcomm... 

--- MCA component grpcomm:direct (no configuration)
checking for MCA component grpcomm:direct compile mode... dso
checking if MCA component grpcomm:direct can compile... yes

+++ Configuring MCA framework iof
checking for no configure components in framework iof... hnp, orted, tool
checking for m4 configure components in framework iof... 

--- MCA component iof:hnp (no configuration)
checking for MCA component iof:hnp compile mode... dso
checking if MCA component iof:hnp can compile... yes

--- MCA component iof:orted (no configuration)
checking for MCA component iof:orted compile mode... dso
checking if MCA component iof:orted can compile... yes

--- MCA component iof:tool (no configuration)
checking for MCA component iof:tool compile mode... dso
checking if MCA component iof:tool can compile... yes

+++ Configuring MCA framework odls
checking for no configure components in framework odls... 
checking for m4 configure components in framework odls... alps, default, pspawn

--- MCA component odls:alps (m4 configuration macro)
checking for MCA component odls:alps compile mode... dso
checking if MCA component odls:alps can compile... no

--- MCA component odls:default (m4 configuration macro)
checking for MCA component odls:default compile mode... dso
checking for fork... (cached) yes
checking if MCA component odls:default can compile... yes

--- MCA component odls:pspawn (m4 configuration macro)
checking for MCA component odls:pspawn compile mode... dso
checking for posix_spawn... yes
checking if MCA component odls:pspawn can compile... yes

+++ Configuring MCA framework oob
checking for no configure components in framework oob... 
checking for m4 configure components in framework oob... alps, tcp

--- MCA component oob:alps (m4 configuration macro)
checking for MCA component oob:alps compile mode... dso
checking if MCA component oob:alps can compile... no

--- MCA component oob:tcp (m4 configuration macro)
checking for MCA component oob:tcp compile mode... dso
checking for struct sockaddr_in... (cached) yes
checking if MCA component oob:tcp can compile... yes

+++ Configuring MCA framework plm
checking for no configure components in framework plm... 
checking for m4 configure components in framework plm... alps, isolated, lsf, rsh, slurm, tm

--- MCA component plm:alps (m4 configuration macro)
checking for MCA component plm:alps compile mode... dso
checking if MCA component plm:alps can compile... no

--- MCA component plm:isolated (m4 configuration macro)
checking for MCA component plm:isolated compile mode... dso
checking for fork... (cached) yes
checking if MCA component plm:isolated can compile... yes

--- MCA component plm:lsf (m4 configuration macro)
checking for MCA component plm:lsf compile mode... dso
checking if MCA component plm:lsf can compile... no

--- MCA component plm:rsh (m4 configuration macro)
checking for MCA component plm:rsh compile mode... dso
checking for fork... (cached) yes
checking if MCA component plm:rsh can compile... yes

--- MCA component plm:slurm (m4 configuration macro)
checking for MCA component plm:slurm compile mode... dso
checking if MCA component plm:slurm can compile... yes

--- MCA component plm:tm (m4 configuration macro)
checking for MCA component plm:tm compile mode... dso
checking if MCA component plm:tm can compile... no

+++ Configuring MCA framework ras
checking for no configure components in framework ras... simulator
checking for m4 configure components in framework ras... alps, gridengine, lsf, slurm, tm

--- MCA component ras:simulator (no configuration)
checking for MCA component ras:simulator compile mode... dso
checking if MCA component ras:simulator can compile... yes

--- MCA component ras:alps (m4 configuration macro)
checking for MCA component ras:alps compile mode... dso
checking alps/apInfo.h usability... no
checking alps/apInfo.h presence... no
checking for alps/apInfo.h... no
checking if MCA component ras:alps can compile... no

--- MCA component ras:gridengine (m4 configuration macro)
checking for MCA component ras:gridengine compile mode... dso
checking if user requested SGE build... not specified; checking environment
checking for qrsh... no
checking for SGE_ROOT environment variable... not found
checking if MCA component ras:gridengine can compile... no

--- MCA component ras:lsf (m4 configuration macro)
checking for MCA component ras:lsf compile mode... dso
checking if MCA component ras:lsf can compile... no

--- MCA component ras:slurm (m4 configuration macro)
checking for MCA component ras:slurm compile mode... dso
checking if MCA component ras:slurm can compile... yes

--- MCA component ras:tm (m4 configuration macro)
checking for MCA component ras:tm compile mode... dso
checking if MCA component ras:tm can compile... no

+++ Configuring MCA framework regx
checking for no configure components in framework regx... fwd, naive, reverse
checking for m4 configure components in framework regx... 

--- MCA component regx:fwd (no configuration)
checking for MCA component regx:fwd compile mode... dso
checking if MCA component regx:fwd can compile... yes

--- MCA component regx:naive (no configuration)
checking for MCA component regx:naive compile mode... dso
checking if MCA component regx:naive can compile... yes

--- MCA component regx:reverse (no configuration)
checking for MCA component regx:reverse compile mode... dso
checking if MCA component regx:reverse can compile... yes

+++ Configuring MCA framework rmaps
checking for no configure components in framework rmaps... mindist, ppr, rank_file, resilient, round_robin, seq
checking for m4 configure components in framework rmaps... 

--- MCA component rmaps:mindist (no configuration)
checking for MCA component rmaps:mindist compile mode... dso
checking if MCA component rmaps:mindist can compile... yes

--- MCA component rmaps:ppr (no configuration)
checking for MCA component rmaps:ppr compile mode... dso
checking if MCA component rmaps:ppr can compile... yes

--- MCA component rmaps:rank_file (no configuration)
checking for MCA component rmaps:rank_file compile mode... dso
checking if MCA component rmaps:rank_file can compile... yes

--- MCA component rmaps:resilient (no configuration)
checking for MCA component rmaps:resilient compile mode... dso
checking if MCA component rmaps:resilient can compile... yes

--- MCA component rmaps:round_robin (no configuration)
checking for MCA component rmaps:round_robin compile mode... dso
checking if MCA component rmaps:round_robin can compile... yes

--- MCA component rmaps:seq (no configuration)
checking for MCA component rmaps:seq compile mode... dso
checking if MCA component rmaps:seq can compile... yes

+++ Configuring MCA framework rml
checking for no configure components in framework rml... oob
checking for m4 configure components in framework rml... 

--- MCA component rml:oob (no configuration)
checking for MCA component rml:oob compile mode... dso
checking if MCA component rml:oob can compile... yes

+++ Configuring MCA framework routed
checking for no configure components in framework routed... binomial, direct, radix
checking for m4 configure components in framework routed... 

--- MCA component routed:binomial (no configuration)
checking for MCA component routed:binomial compile mode... dso
checking if MCA component routed:binomial can compile... yes

--- MCA component routed:direct (no configuration)
checking for MCA component routed:direct compile mode... dso
checking if MCA component routed:direct can compile... yes

--- MCA component routed:radix (no configuration)
checking for MCA component routed:radix compile mode... dso
checking if MCA component routed:radix can compile... yes

+++ Configuring MCA framework rtc
checking for no configure components in framework rtc... hwloc
checking for m4 configure components in framework rtc... 

--- MCA component rtc:hwloc (no configuration)
checking for MCA component rtc:hwloc compile mode... dso
checking if MCA component rtc:hwloc can compile... yes

+++ Configuring MCA framework schizo
checking for no configure components in framework schizo... flux, ompi, orte
checking for m4 configure components in framework schizo... alps, moab, singularity, slurm

--- MCA component schizo:flux (no configuration)
checking for MCA component schizo:flux compile mode... dso
checking if MCA component schizo:flux can compile... yes

--- MCA component schizo:ompi (no configuration)
checking for MCA component schizo:ompi compile mode... dso
checking if MCA component schizo:ompi can compile... yes

--- MCA component schizo:orte (no configuration)
checking for MCA component schizo:orte compile mode... dso
checking if MCA component schizo:orte can compile... yes

--- MCA component schizo:alps (m4 configuration macro)
checking for MCA component schizo:alps compile mode... dso
checking for alps/apInfo.h... (cached) no
checking if MCA component schizo:alps can compile... no

--- MCA component schizo:moab (m4 configuration macro)
checking for MCA component schizo:moab compile mode... dso
checking --with-moab value... simple ok (unspecified value)
checking --with-moab-libdir value... simple ok (unspecified value)
checking looking for moab in... ()
checking mapi.h usability... no
checking mapi.h presence... no
checking for mapi.h... no
checking if MCA component schizo:moab can compile... no

--- MCA component schizo:singularity (m4 configuration macro)
checking for MCA component schizo:singularity compile mode... dso
checking if Singularity support is to be built... yes
checking for singularity... no
checking if MCA component schizo:singularity can compile... no

--- MCA component schizo:slurm (m4 configuration macro)
checking for MCA component schizo:slurm compile mode... dso
checking if MCA component schizo:slurm can compile... yes

+++ Configuring MCA framework snapc
checking for no configure components in framework snapc... 
checking for m4 configure components in framework snapc... full

--- MCA component snapc:full (m4 configuration macro)
checking for MCA component snapc:full compile mode... dso
checking if MCA component snapc:full can compile... no

+++ Configuring MCA framework sstore
checking for no configure components in framework sstore... 
checking for m4 configure components in framework sstore... central, stage

--- MCA component sstore:central (m4 configuration macro)
checking for MCA component sstore:central compile mode... dso
checking if MCA component sstore:central can compile... no

--- MCA component sstore:stage (m4 configuration macro)
checking for MCA component sstore:stage compile mode... dso
checking if MCA component sstore:stage can compile... no

+++ Configuring MCA framework state
checking for no configure components in framework state... app, hnp, novm, orted, tool
checking for m4 configure components in framework state... 

--- MCA component state:app (no configuration)
checking for MCA component state:app compile mode... dso
checking if MCA component state:app can compile... yes

--- MCA component state:hnp (no configuration)
checking for MCA component state:hnp compile mode... dso
checking if MCA component state:hnp can compile... yes

--- MCA component state:novm (no configuration)
checking for MCA component state:novm compile mode... dso
checking if MCA component state:novm can compile... yes

--- MCA component state:orted (no configuration)
checking for MCA component state:orted compile mode... dso
checking if MCA component state:orted can compile... yes

--- MCA component state:tool (no configuration)
checking for MCA component state:tool compile mode... dso
checking if MCA component state:tool can compile... yes

*** Configuring MCA for ompi
checking for frameworks for ompi... common, bml, coll, crcp, fbtl, fcoll, fs, hook, io, mtl, op, osc, pml, rte, sharedfp, topo, vprotocol

+++ Configuring MCA framework common
checking for no configure components in framework common... 
checking for m4 configure components in framework common... monitoring, ompio

--- MCA component common:monitoring (m4 configuration macro)
checking for MCA component common:monitoring compile mode... dso
checking if MCA component common:monitoring can compile... yes

--- MCA component common:ompio (m4 configuration macro)
checking for MCA component common:ompio compile mode... dso
checking if MCA component common:ompio can compile... yes

+++ Configuring MCA framework bml
checking for no configure components in framework bml... 
checking for m4 configure components in framework bml... r2

--- MCA component bml:r2 (m4 configuration macro)
checking for MCA component bml:r2 compile mode... dso
checking if MCA component bml:r2 can compile... yes
checking for index in endpoint array for tag BML... 0

+++ Configuring MCA framework coll
checking for no configure components in framework coll... basic, inter, libnbc, self, sm, sync, tuned
checking for m4 configure components in framework coll... cuda, fca, hcoll, monitoring, portals4

--- MCA component coll:basic (no configuration)
checking for MCA component coll:basic compile mode... dso
checking if MCA component coll:basic can compile... yes

--- MCA component coll:inter (no configuration)
checking for MCA component coll:inter compile mode... dso
checking if MCA component coll:inter can compile... yes

--- MCA component coll:libnbc (no configuration)
checking for MCA component coll:libnbc compile mode... dso
checking if MCA component coll:libnbc can compile... yes

--- MCA component coll:self (no configuration)
checking for MCA component coll:self compile mode... dso
checking if MCA component coll:self can compile... yes

--- MCA component coll:sm (no configuration)
checking for MCA component coll:sm compile mode... dso
checking if MCA component coll:sm can compile... yes

--- MCA component coll:sync (no configuration)
checking for MCA component coll:sync compile mode... dso
checking if MCA component coll:sync can compile... yes

--- MCA component coll:tuned (no configuration)
checking for MCA component coll:tuned compile mode... dso
checking if MCA component coll:tuned can compile... yes

--- MCA component coll:cuda (m4 configuration macro)
checking for MCA component coll:cuda compile mode... dso
checking if MCA component coll:cuda can compile... yes

--- MCA component coll:fca (m4 configuration macro)
checking for MCA component coll:fca compile mode... dso
checking fca/fca_api.h usability... no
checking fca/fca_api.h presence... no
checking for fca/fca_api.h... no
checking if MCA component coll:fca can compile... no

--- MCA component coll:hcoll (m4 configuration macro)
checking for MCA component coll:hcoll compile mode... dso
checking hcoll/api/hcoll_api.h usability... no
checking hcoll/api/hcoll_api.h presence... no
checking for hcoll/api/hcoll_api.h... no
checking if MCA component coll:hcoll can compile... no

--- MCA component coll:monitoring (m4 configuration macro)
checking for MCA component coll:monitoring compile mode... dso
checking if MCA component coll:monitoring can compile... yes

--- MCA component coll:portals4 (m4 configuration macro)
checking for MCA component coll:portals4 compile mode... dso
checking if MCA component coll:portals4 can compile... no

+++ Configuring MCA framework crcp
checking for no configure components in framework crcp... 
checking for m4 configure components in framework crcp... bkmrk

--- MCA component crcp:bkmrk (m4 configuration macro)
checking for MCA component crcp:bkmrk compile mode... dso
checking if MCA component crcp:bkmrk can compile... no

+++ Configuring MCA framework fbtl
checking for no configure components in framework fbtl... 
checking for m4 configure components in framework fbtl... posix, pvfs2

--- MCA component fbtl:posix (m4 configuration macro)
checking for MCA component fbtl:posix compile mode... dso
checking for aio.h... (cached) yes
checking for library containing aio_write... none required
checking for pwritev... yes
checking for preadv... yes
checking if MCA component fbtl:posix can compile... yes

--- MCA component fbtl:pvfs2 (m4 configuration macro)
checking for MCA component fbtl:pvfs2 compile mode... dso
checking --with-pvfs2 value... simple ok (unspecified value)
looking for header without includes
checking pvfs2.h usability... no
checking pvfs2.h presence... no
checking for pvfs2.h... no
checking pvfs2.h usability... no
checking pvfs2.h presence... no
checking for pvfs2.h... no
checking if MCA component fbtl:pvfs2 can compile... no

+++ Configuring MCA framework fcoll
checking for no configure components in framework fcoll... dynamic, dynamic_gen2, individual, two_phase, vulcan
checking for m4 configure components in framework fcoll... 

--- MCA component fcoll:dynamic (no configuration)
checking for MCA component fcoll:dynamic compile mode... dso
checking if MCA component fcoll:dynamic can compile... yes

--- MCA component fcoll:dynamic_gen2 (no configuration)
checking for MCA component fcoll:dynamic_gen2 compile mode... dso
checking if MCA component fcoll:dynamic_gen2 can compile... yes

--- MCA component fcoll:individual (no configuration)
checking for MCA component fcoll:individual compile mode... dso
checking if MCA component fcoll:individual can compile... yes

--- MCA component fcoll:two_phase (no configuration)
checking for MCA component fcoll:two_phase compile mode... dso
checking if MCA component fcoll:two_phase can compile... yes

--- MCA component fcoll:vulcan (no configuration)
checking for MCA component fcoll:vulcan compile mode... dso
checking if MCA component fcoll:vulcan can compile... yes

+++ Configuring MCA framework fs
checking for no configure components in framework fs... 
checking for m4 configure components in framework fs... lustre, pvfs2, ufs

--- MCA component fs:lustre (m4 configuration macro)
checking for MCA component fs:lustre compile mode... dso
checking --with-lustre value... simple ok (unspecified value)
looking for header without includes
checking lustre/lustreapi.h usability... no
checking lustre/lustreapi.h presence... no
checking for lustre/lustreapi.h... no
checking lustre/lustreapi.h usability... no
checking lustre/lustreapi.h presence... no
checking for lustre/lustreapi.h... no
checking for required lustre data structures... no
checking if MCA component fs:lustre can compile... no

--- MCA component fs:pvfs2 (m4 configuration macro)
checking for MCA component fs:pvfs2 compile mode... dso
checking --with-pvfs2 value... simple ok (unspecified value)
looking for header without includes
checking pvfs2.h usability... no
checking pvfs2.h presence... no
checking for pvfs2.h... no
checking pvfs2.h usability... no
checking pvfs2.h presence... no
checking for pvfs2.h... no
checking if MCA component fs:pvfs2 can compile... no

--- MCA component fs:ufs (m4 configuration macro)
checking for MCA component fs:ufs compile mode... dso
checking if MCA component fs:ufs can compile... yes

+++ Configuring MCA framework hook
checking for no configure components in framework hook... 
checking for m4 configure components in framework hook... 

+++ Configuring MCA framework io
checking for no configure components in framework io... 
checking for m4 configure components in framework io... ompio, romio321

--- MCA component io:ompio (m4 configuration macro)
checking for MCA component io:ompio compile mode... dso
checking if MCA component io:ompio can compile... yes

--- MCA component io:romio321 (m4 configuration macro)
checking for MCA component io:romio321 compile mode... dso
checking if want ROMIO component... yes
checking if MPI profiling is enabled... yes

*** Configuring ROMIO distribution
configure: OPAL configuring in ompi/mca/io/romio321/romio
configure: running /bin/bash '../../../../../../openmpi/ompi/mca/io/romio321/romio/configure'  FROM_OMPI=yes CC="gcc" CFLAGS="-O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -D__EXTENSIONS__" CPPFLAGS="-I/scratch/build/opal/mca/event/libevent2022/libevent/include -I/scratch/openmpi/opal/mca/event/libevent2022/libevent -I/scratch/openmpi/opal/mca/event/libevent2022/libevent/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include  " FFLAGS="" LDFLAGS=" " --enable-shared --disable-static  --prefix=/opt/openmpi --disable-aio --disable-weak-symbols --enable-strict --disable-f77 --disable-f90 --cache-file=/dev/null --srcdir=../../../../../../openmpi/ompi/mca/io/romio321/romio --disable-option-checking
checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
checking for a thread-safe mkdir -p... /usr/bin/mkdir -p
checking for gawk... no
checking for mawk... mawk
checking whether make sets $(MAKE)... yes
checking whether make supports nested variables... yes
checking whether to enable maintainer-specific portions of Makefiles... yes
checking for style of include used by make... GNU
checking whether the C compiler works... yes
checking for C compiler default output file name... a.out
checking for suffix of executables... 
checking whether we are cross compiling... no
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether gcc accepts -g... yes
checking for gcc option to accept ISO C89... none needed
checking whether gcc understands -c and -o together... yes
checking dependency style of gcc... gcc3
checking for ar... ar
checking the archiver (ar) interface... ar
checking build system type... x86_64-unknown-linux-gnu
checking host system type... x86_64-unknown-linux-gnu
checking how to print strings... printf
checking for a sed that does not truncate output... /usr/bin/sed
checking for grep that handles long lines and -e... /usr/bin/grep
checking for egrep... /usr/bin/grep -E
checking for fgrep... /usr/bin/grep -F
checking for ld used by gcc... /usr/bin/ld
checking if the linker (/usr/bin/ld) is GNU ld... yes
checking for BSD- or MS-compatible name lister (nm)... /usr/bin/nm -B
checking the name lister (/usr/bin/nm -B) interface... BSD nm
checking whether ln -s works... yes
checking the maximum length of command line arguments... 1572864
checking how to convert x86_64-unknown-linux-gnu file names to x86_64-unknown-linux-gnu format... func_convert_file_noop
checking how to convert x86_64-unknown-linux-gnu file names to toolchain format... func_convert_file_noop
checking for /usr/bin/ld option to reload object files... -r
checking for objdump... objdump
checking how to recognize dependent libraries... pass_all
checking for dlltool... no
checking how to associate runtime and link libraries... printf %s\n
checking for archiver @FILE support... @
checking for strip... strip
checking for ranlib... ranlib
checking command to parse /usr/bin/nm -B output from gcc object... ok
checking for sysroot... no
checking for a working dd... /usr/bin/dd
checking how to truncate binary pipes... /usr/bin/dd bs=4096 count=1
../../../../../../openmpi/ompi/mca/io/romio321/romio/configure: line 7111: /usr/bin/file: No such file or directory
checking for mt... no
checking if : is a manifest tool... no
checking how to run the C preprocessor... gcc -E
checking for ANSI C header files... yes
checking for sys/types.h... yes
checking for sys/stat.h... yes
checking for stdlib.h... yes
checking for string.h... yes
checking for memory.h... yes
checking for strings.h... yes
checking for inttypes.h... yes
checking for stdint.h... yes
checking for unistd.h... yes
checking for dlfcn.h... yes
checking for objdir... .libs
checking if gcc supports -fno-rtti -fno-exceptions... yes
checking for gcc option to produce PIC... -fPIC -DPIC
checking if gcc PIC flag -fPIC -DPIC works... yes
checking if gcc static flag -static works... yes
checking if gcc supports -c -o file.o... yes
checking if gcc supports -c -o file.o... (cached) yes
checking whether the gcc linker (/usr/bin/ld) supports shared libraries... yes
checking whether -lc should be explicitly linked in... no
checking dynamic linker characteristics... GNU/Linux ld.so
checking how to hardcode library paths into programs... immediate
checking whether stripping libraries is possible... yes
checking if libtool supports shared libraries... yes
checking whether to build shared libraries... yes
checking whether to build static libraries... no
checking whether make supports nested variables... (cached) yes
Configuring with args dummy mt
checking for Open MPI support files... in Open MPI source tree -- good
checking for make... make
checking whether clock skew breaks make... no
checking whether make supports include... yes
checking whether make allows comments in actions... yes
checking for virtual path format... VPATH
checking whether make sets CFLAGS... yes
configure: WARNING: Unknown architecture ... proceeding anyway
ROMIO home directory is ../../../../../../openmpi/ompi/mca/io/romio321/romio
checking for long long... yes
checking size of long long... 8
checking for memalign... yes
checking for size_t... yes
checking for ssize_t... yes
checking for off_t... yes
checking how to run the C preprocessor... gcc -E
checking for unistd.h... (cached) yes
checking fcntl.h usability... yes
checking fcntl.h presence... yes
checking for fcntl.h... yes
checking malloc.h usability... yes
checking malloc.h presence... yes
checking for malloc.h... yes
checking stddef.h usability... yes
checking stddef.h presence... yes
checking for stddef.h... yes
checking for sys/types.h... (cached) yes
checking limits.h usability... yes
checking limits.h presence... yes
checking for limits.h... yes
checking time.h usability... yes
checking time.h presence... yes
checking for time.h... yes
checking for mpix.h... no
checking for u_char... yes
checking for u_short... yes
checking for u_int... yes
checking for u_long... yes
checking sys/attr.h usability... no
checking sys/attr.h presence... no
checking for sys/attr.h... no
checking size of int... 4
checking size of void *... 8
checking for int large enough for pointers... no
checking size of long long... (cached) 8
checking whether struct flock compatible with MPI_Offset... yes
checking for pvfs2-config... notfound
checking configured file systems... testfs ufs nfs
configure: WARNING: File locks may not work with NFS.  See the Installation and
users manual for instructions on testing and if necessary fixing this
checking sys/vfs.h usability... yes
checking sys/vfs.h presence... yes
checking for sys/vfs.h... yes
checking sys/param.h usability... yes
checking sys/param.h presence... yes
checking for sys/param.h... yes
checking sys/mount.h usability... yes
checking sys/mount.h presence... yes
checking for sys/mount.h... yes
checking sys/statvfs.h usability... yes
checking sys/statvfs.h presence... yes
checking for sys/statvfs.h... yes
checking for statfs... yes
checking whether struct statfs properly defined... yes
checking for f_fstypename member of statfs structure... no
checking for sys/stat.h... (cached) yes
checking for sys/types.h... (cached) yes
checking for unistd.h... (cached) yes
checking for stat... yes
checking for st_fstype member of stat structure... no
checking for sys/types.h... (cached) yes
checking for sys/statvfs.h... (cached) yes
checking for sys/vfs.h... (cached) yes
checking for statvfs... yes
checking for f_basetype member of statvfs structure... no
checking for blksize_t... yes
checking for special C compiler options needed for large files... no
checking for _FILE_OFFSET_BITS value needed for large files... no
checking whether pwrite is declared... yes
checking for strerror... yes
checking for doctext... no
checking for strdup... yes
checking whether strdup needs a declaration... no
checking for snprintf... yes
checking whether snprintf needs a declaration... no
checking for lstat... yes
checking whether lstat needs a declaration... no
checking for readlink... yes
checking whether readlink needs a declaration... no
checking for fsync... yes
checking whether fsync needs a declaration... no
checking for ftruncate... yes
checking whether ftruncate needs a declaration... no
checking for lseek64... yes
checking whether lseek64 needs a declaration... yes
checking for usleep... yes
checking whether usleep needs a declaration... no
setting SYSDEP_INC to 
checking for C/C++ restrict keyword... __restrict
checking whether __attribute__ allowed... yes
checking whether __attribute__((format)) allowed... yes
checking for gcov... gcov
setting CC to gcc
setting F77 to :
setting TEST_CC to mpicc
setting TEST_F77 to mpifort
setting CFLAGS to -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -D__EXTENSIONS__ -DHAVE_ROMIOCONF_H
setting USER_CFLAGS to 
setting USER_FFLAGS to 
checking that generated files are newer than configure... done
configure: creating ./config.status
config.status: creating Makefile
config.status: creating localdefs
config.status: creating test/Makefile
config.status: creating test/misc.c
config.status: creating test/large_file.c
config.status: creating test/runtests
config.status: creating test-internal/Makefile
config.status: creating util/romioinstall
config.status: creating include/mpio.h
config.status: creating test/fmisc.f
config.status: creating test/fcoll_test.f
config.status: creating test/pfcoll_test.f
config.status: creating test/fperf.f
config.status: creating adio/include/romioconf.h
config.status: executing depfiles commands
config.status: executing libtool commands
config.status: executing default-1 commands
configure: /bin/bash '../../../../../../openmpi/ompi/mca/io/romio321/romio/configure' succeeded for ompi/mca/io/romio321/romio
ROMIO distribution configured successfully
checking if MCA component io:romio321 can compile... yes

+++ Configuring MCA framework mtl
checking for no configure components in framework mtl... 
checking for m4 configure components in framework mtl... ofi, portals4, psm, psm2

--- MCA component mtl:ofi (m4 configuration macro)
checking for MCA component mtl:ofi compile mode... dso
checking if OFI libfabric is available... no
checking if MCA component mtl:ofi can compile... no

--- MCA component mtl:portals4 (m4 configuration macro)
checking for MCA component mtl:portals4 compile mode... dso
checking whether to enable flow control... yes
checking if MCA component mtl:portals4 can compile... no

--- MCA component mtl:psm (m4 configuration macro)
checking for MCA component mtl:psm compile mode... dso
checking --with-psm value... simple ok (unspecified value)
checking --with-psm-libdir value... simple ok (unspecified value)
checking psm.h usability... no
checking psm.h presence... no
checking for psm.h... no
checking if MCA component mtl:psm can compile... no

--- MCA component mtl:psm2 (m4 configuration macro)
checking for MCA component mtl:psm2 compile mode... dso
checking --with-psm2 value... simple ok (unspecified value)
checking --with-psm2-libdir value... simple ok (unspecified value)
checking psm2.h usability... no
checking psm2.h presence... no
checking for psm2.h... no
checking if MCA component mtl:psm2 can compile... no

+++ Configuring MCA framework op
checking for no configure components in framework op... 
checking for m4 configure components in framework op... 

+++ Configuring MCA framework osc
checking for no configure components in framework osc... sm
checking for m4 configure components in framework osc... monitoring, portals4, pt2pt, rdma, ucx

--- MCA component osc:sm (no configuration)
checking for MCA component osc:sm compile mode... dso
checking if MCA component osc:sm can compile... yes

--- MCA component osc:monitoring (m4 configuration macro)
checking for MCA component osc:monitoring compile mode... dso
checking if MCA component osc:monitoring can compile... yes

--- MCA component osc:portals4 (m4 configuration macro)
checking for MCA component osc:portals4 compile mode... dso
checking if MCA component osc:portals4 can compile... no

--- MCA component osc:pt2pt (m4 configuration macro)
checking for MCA component osc:pt2pt compile mode... dso
checking if MCA component osc:pt2pt can compile... yes

--- MCA component osc:rdma (m4 configuration macro)
checking for MCA component osc:rdma compile mode... dso
checking if MCA component osc:rdma can compile... yes
checking for index in endpoint array for tag BML... 0

--- MCA component osc:ucx (m4 configuration macro)
checking for MCA component osc:ucx compile mode... dso
checking if MCA component osc:ucx can compile... no

+++ Configuring MCA framework pml
checking for no configure components in framework pml... cm
checking for m4 configure components in framework pml... crcpw, monitoring, ob1, ucx, v, yalla

--- MCA component pml:cm (no configuration)
checking for MCA component pml:cm compile mode... dso
checking if MCA component pml:cm can compile... yes

--- MCA component pml:crcpw (m4 configuration macro)
checking for MCA component pml:crcpw compile mode... dso
checking if MCA component pml:crcpw can compile... no

--- MCA component pml:monitoring (m4 configuration macro)
checking for MCA component pml:monitoring compile mode... dso
checking if MCA component pml:monitoring can compile... yes

--- MCA component pml:ob1 (m4 configuration macro)
checking for MCA component pml:ob1 compile mode... dso
checking if MCA component pml:ob1 can compile... yes
checking for index in endpoint array for tag BML... 0

--- MCA component pml:ucx (m4 configuration macro)
checking for MCA component pml:ucx compile mode... dso
checking if MCA component pml:ucx can compile... no

--- MCA component pml:v (m4 configuration macro)
checking for MCA component pml:v compile mode... static
checking if MCA component pml:v can compile... yes

--- MCA component pml:yalla (m4 configuration macro)
checking for MCA component pml:yalla compile mode... dso
checking --with-mxm-libdir value... simple ok (unspecified value)
checking mxm/api/mxm_api.h usability... no
checking mxm/api/mxm_api.h presence... no
checking for mxm/api/mxm_api.h... no
checking for MXM version compatibility... no
checking if MCA component pml:yalla can compile... no

+++ Configuring MCA framework rte
checking for no configure components in framework rte... 
checking for m4 configure components in framework rte... orte, pmix

--- MCA component rte:pmix (m4 configuration macro, priority 50)
checking for MCA component rte:pmix compile mode... static
checking if MCA component rte:pmix can compile... no

--- MCA component rte:orte (m4 configuration macro, priority 10)
checking for MCA component rte:orte compile mode... static
checking if MCA component rte:orte can compile... yes

+++ Configuring MCA framework sharedfp
checking for no configure components in framework sharedfp... individual, lockedfile
checking for m4 configure components in framework sharedfp... sm

--- MCA component sharedfp:individual (no configuration)
checking for MCA component sharedfp:individual compile mode... dso
checking if MCA component sharedfp:individual can compile... yes

--- MCA component sharedfp:lockedfile (no configuration)
checking for MCA component sharedfp:lockedfile compile mode... dso
checking if MCA component sharedfp:lockedfile can compile... yes

--- MCA component sharedfp:sm (m4 configuration macro)
checking for MCA component sharedfp:sm compile mode... dso
checking semaphore.h usability... yes
checking semaphore.h presence... yes
checking for semaphore.h... yes
checking for sem_open... yes
checking for semaphore.h... (cached) yes
checking for sem_init... yes
checking if MCA component sharedfp:sm can compile... yes

+++ Configuring MCA framework topo
checking for no configure components in framework topo... basic
checking for m4 configure components in framework topo... treematch

--- MCA component topo:basic (no configuration)
checking for MCA component topo:basic compile mode... dso
checking if MCA component topo:basic can compile... yes

--- MCA component topo:treematch (m4 configuration macro)
checking for MCA component topo:treematch compile mode... dso
checking TreeMatch headers... in the source
checking --with-treematch value... sanity check ok (/scratch/openmpi/ompi/mca/topo/treematch/treematch)
checking if MCA component topo:treematch can compile... yes

+++ Configuring MCA framework vprotocol
checking for no configure components in framework vprotocol... pessimist
checking for m4 configure components in framework vprotocol... 

--- MCA component vprotocol:pessimist (no configuration)
checking for MCA component vprotocol:pessimist compile mode... dso
checking if MCA component vprotocol:pessimist can compile... yes

*** Configuring MCA for oshmem
checking for frameworks for oshmem... atomic, memheap, scoll, spml, sshmem

+++ Configuring MCA framework atomic
checking for no configure components in framework atomic... basic
checking for m4 configure components in framework atomic... mxm, ucx

--- MCA component atomic:basic (no configuration)
checking for MCA component atomic:basic compile mode... dso
checking if MCA component atomic:basic can compile... yes

--- MCA component atomic:mxm (m4 configuration macro)
checking for MCA component atomic:mxm compile mode... dso
checking if oshmem/atomic/mxm component can be compiled... no
checking if MCA component atomic:mxm can compile... no

--- MCA component atomic:ucx (m4 configuration macro)
checking for MCA component atomic:ucx compile mode... dso
checking if MCA component atomic:ucx can compile... no

+++ Configuring MCA framework memheap
checking for no configure components in framework memheap... buddy, ptmalloc
checking for m4 configure components in framework memheap... 

--- MCA component memheap:buddy (no configuration)
checking for MCA component memheap:buddy compile mode... dso
checking if MCA component memheap:buddy can compile... yes

--- MCA component memheap:ptmalloc (no configuration)
checking for MCA component memheap:ptmalloc compile mode... dso
checking if MCA component memheap:ptmalloc can compile... yes

+++ Configuring MCA framework scoll
checking for no configure components in framework scoll... basic, mpi
checking for m4 configure components in framework scoll... fca

--- MCA component scoll:basic (no configuration)
checking for MCA component scoll:basic compile mode... dso
checking if MCA component scoll:basic can compile... yes

--- MCA component scoll:mpi (no configuration)
checking for MCA component scoll:mpi compile mode... dso
checking if MCA component scoll:mpi can compile... yes

--- MCA component scoll:fca (m4 configuration macro)
checking for MCA component scoll:fca compile mode... dso
checking fca/fca_api.h usability... no
checking fca/fca_api.h presence... no
checking for fca/fca_api.h... no
checking if MCA component scoll:fca can compile... no

+++ Configuring MCA framework spml
checking for no configure components in framework spml... 
checking for m4 configure components in framework spml... ikrit, ucx

--- MCA component spml:ikrit (m4 configuration macro)
checking for MCA component spml:ikrit compile mode... dso
checking if MCA component spml:ikrit can compile... no

--- MCA component spml:ucx (m4 configuration macro)
checking for MCA component spml:ucx compile mode... dso
checking if MCA component spml:ucx can compile... no

+++ Configuring MCA framework sshmem
checking for no configure components in framework sshmem... 
checking for m4 configure components in framework sshmem... mmap, sysv, ucx, verbs

--- MCA component sshmem:mmap (m4 configuration macro)
checking for MCA component sshmem:mmap compile mode... dso
checking if want mmap shared memory support... yes
checking for library containing mmap... (cached) none required
checking if MCA component sshmem:mmap can compile... yes

--- MCA component sshmem:sysv (m4 configuration macro)
checking for MCA component sshmem:sysv compile mode... dso
checking if want SYSV shared memory support... yes
checking for shmget... (cached) yes
checking if MCA component sshmem:sysv can compile... yes

--- MCA component sshmem:ucx (m4 configuration macro)
checking for MCA component sshmem:ucx compile mode... dso
configure: UCX device memory allocation is not supported
checking if MCA component sshmem:ucx can compile... no

--- MCA component sshmem:verbs (m4 configuration macro)
checking for MCA component sshmem:verbs compile mode... dso
checking if want verbs shared memory support... yes
checking if MCA component sshmem:verbs can compile... no
checking for size of endpoint array... 1
configure: WARNING: No spml found.  Will not build OpenSHMEM layer.

============================================================================
== Extended MPI interfaces setup
============================================================================
checking for available MPI Extensions... affinity, cr, cuda, pcollreq
checking which MPI extension should be enabled... All Available Extensions

--- MPI Extension affinity
checking if MPI Extension affinity can compile... yes
checking if MPI Extension affinity has C bindings... yes (required)
checking if MPI Extension affinity has mpif.h bindings... no
checking if MPI Extension affinity has "use mpi" bindings... no
checking if MPI Extension affinity has "use mpi_f08" bindings... no

--- MPI Extension cr
checking if MPI Extension cr can compile... no

--- MPI Extension cuda
checking if MPI Extension cuda can compile... yes
checking if MPI Extension cuda has C bindings... yes (required)
checking if MPI Extension cuda has mpif.h bindings... no
checking if MPI Extension cuda has "use mpi" bindings... no
checking if MPI Extension cuda has "use mpi_f08" bindings... no

--- MPI Extension pcollreq
checking if MPI Extension pcollreq can compile... yes
checking if MPI Extension pcollreq has C bindings... yes (required)
checking if MPI Extension pcollreq has mpif.h bindings... yes
checking if MPI Extension pcollreq has "use mpi" bindings... yes
checking if MPI Extension pcollreq has "use mpi_f08" bindings... yes
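
Per the checks above, only the affinity, cuda, and pcollreq extensions will be built (the cr extension fails its compile test). A minimal post-install sanity check, assuming the /opt/openmpi prefix used by the install steps later in this log, would be:

/opt/openmpi/bin/ompi_info | grep -i extensions
# expected to print something along the lines of:
#   MPI extensions: affinity, cuda, pcollreq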

============================================================================
== Contributed software setup
============================================================================

*** Configuring contributed software packages
checking which contributed software packages should be disabled... 

--- libompitrace (m4 configuration macro)
checking if contributed component libompitrace can compile... yes

============================================================================
== Symbol visibility feature
============================================================================
checking if gcc supports -fvisibility=hidden... yes
checking whether to enable symbol visibility... yes (via -fvisibility=hidden)
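
As a hedged illustration (not taken from the Open MPI sources) of what the -fvisibility=hidden result above means in practice: only symbols explicitly marked with default visibility remain exported from a shared library built with that flag.

cat > vis.c <<'EOF'
__attribute__((visibility("default"))) int exported(void) { return 1; }
int hidden(void) { return 0; }
EOF
gcc -fvisibility=hidden -fPIC -shared -o libvis.so vis.c
nm -D libvis.so
# only "exported" should be listed as a dynamic symbol; "hidden" is not exported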

============================================================================
== Final top-level OMPI configuration
============================================================================

*** Libtool configuration
checking how to print strings... printf
checking for a sed that does not truncate output... /usr/bin/sed
checking for ld used by gcc... /usr/bin/ld
checking if the linker (/usr/bin/ld) is GNU ld... yes
checking the maximum length of command line arguments... 1572864
checking how to convert x86_64-unknown-linux-gnu file names to x86_64-unknown-linux-gnu format... func_convert_file_noop
checking how to convert x86_64-unknown-linux-gnu file names to toolchain format... func_convert_file_noop
checking for /usr/bin/ld option to reload object files... -r
checking how to recognize dependent libraries... pass_all
checking for dlltool... no
checking how to associate runtime and link libraries... printf %s\n
checking for ar... ar
checking for archiver @FILE support... @
checking for strip... strip
checking for ranlib... ranlib
checking command to parse /usr/bin/nm -B output from gcc object... ok
checking for sysroot... no
checking for a working dd... /usr/bin/dd
checking how to truncate binary pipes... /usr/bin/dd bs=4096 count=1
../openmpi/configure: line 385022: /usr/bin/file: No such file or directory
checking for mt... no
checking if : is a manifest tool... no
checking for dlfcn.h... (cached) yes
checking for objdir... .libs
checking if gcc supports -fno-rtti -fno-exceptions... yes
checking for gcc option to produce PIC... -fPIC -DPIC
checking if gcc PIC flag -fPIC -DPIC works... yes
checking if gcc static flag -static works... yes
checking if gcc supports -c -o file.o... yes
checking if gcc supports -c -o file.o... (cached) yes
checking whether the gcc linker (/usr/bin/ld) supports shared libraries... yes
checking whether -lc should be explicitly linked in... no
checking dynamic linker characteristics... GNU/Linux ld.so
checking how to hardcode library paths into programs... immediate
checking for shl_load... no
checking for shl_load in -ldld... no
checking for dlopen... no
checking for dlopen in -ldl... yes
checking whether a program can dlopen itself... yes
checking whether a statically linked program can dlopen itself... no
checking whether stripping libraries is possible... yes
checking if libtool supports shared libraries... yes
checking whether to build shared libraries... yes
checking whether to build static libraries... no
checking how to run the C++ preprocessor... g++ -E
checking for ld used by g++... /usr/bin/ld
checking if the linker (/usr/bin/ld) is GNU ld... yes
checking whether the g++ linker (/usr/bin/ld) supports shared libraries... yes
checking for g++ option to produce PIC... -fPIC -DPIC
checking if g++ PIC flag -fPIC -DPIC works... yes
checking if g++ static flag -static works... yes
checking if g++ supports -c -o file.o... yes
checking if g++ supports -c -o file.o... (cached) yes
checking whether the g++ linker (/usr/bin/ld) supports shared libraries... yes
checking dynamic linker characteristics... (cached) GNU/Linux ld.so
checking how to hardcode library paths into programs... immediate
configure: creating ./config.lt
config.lt: creating libtool

*** Compiler flags
checking which of CFLAGS are ok for debugger modules...  -DNDEBUG -w -fno-strict-aliasing -mcx16 -pthread
checking for debugger extra CFLAGS... -g
checking if -fasynchronous-unwind-tables compiler flag works... yes
checking for final compiler unwind flags... -fasynchronous-unwind-tables

*** Wrapper compiler final setup
checking for perl... /usr/bin/perl
checking if linker supports RPATH... yes (-Wl,-rpath -Wl,LIBDIR + )
checking if linker supports RUNPATH... yes (-Wl,--enable-new-dtags)
checking for OPAL CPPFLAGS...    
checking for OPAL CFLAGS... -pthread 
checking for OPAL CFLAGS_PREFIX... 
checking for OPAL CXXFLAGS... -pthread 
checking for OPAL CXXFLAGS_PREFIX... 
checking for OPAL LDFLAGS...     -Wl,-rpath -Wl,@{libdir} -Wl,--enable-new-dtags
checking for OPAL pkg-config LDFLAGS...     -Wl,-rpath -Wl,${libdir} -Wl,--enable-new-dtags
checking for OPAL LIBS... -lm -ldl -lutil 
checking for ORTE CPPFLAGS...    
checking for ORTE CFLAGS... -pthread 
checking for ORTE CFLAGS_PREFIX... 
checking for ORTE LDFLAGS...     -Wl,-rpath -Wl,@{libdir} -Wl,--enable-new-dtags
checking for ORTE pkg-config LDFLAGS...     -Wl,-rpath -Wl,${libdir} -Wl,--enable-new-dtags
checking for ORTE LIBS... -lm -ldl -lutil 
checking for OMPI CPPFLAGS...    
checking for OMPI CFLAGS... -pthread 
checking for OMPI CFLAGS_PREFIX... 
checking for OMPI CXXFLAGS... -pthread 
checking for OMPI CXXFLAGS_PREFIX... 
checking for OMPI FCFLAGS...  
checking for OMPI FCFLAGS_PREFIX... 
checking for OMPI LDFLAGS...     -Wl,-rpath -Wl,@{libdir} -Wl,--enable-new-dtags
checking for OMPI pkg-config LDFLAGS...     -Wl,-rpath -Wl,${libdir} -Wl,--enable-new-dtags
checking for OMPI LIBS... -lm -ldl -lutil  -lrt
checking if libtool needs -no-undefined flag to build shared libraries... no
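
The RPATH/RUNPATH and OPAL/ORTE/OMPI flag results above are what get baked into the wrapper compilers. A quick way to inspect them after installation (illustrative; the exact flags echoed depend on the install prefix and platform):

/opt/openmpi/bin/mpicc --showme:compile
/opt/openmpi/bin/mpicc --showme:link
# the link line should include -pthread, -L<prefix>/lib and the
# "-Wl,-rpath -Wl,<prefix>/lib -Wl,--enable-new-dtags" options reported above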

*** Final output
checking for libraries that use libnl v1... (none)
checking for libraries that use libnl v3... (none)
checking that generated files are newer than configure... done
configure: creating ./config.status
config.status: creating ompi/include/ompi/version.h
config.status: creating orte/include/orte/version.h
config.status: creating oshmem/include/oshmem/version.h
config.status: creating opal/include/opal/version.h
config.status: creating ompi/mpi/java/Makefile
config.status: creating ompi/mpi/java/java/Makefile
config.status: creating ompi/mpi/java/c/Makefile
config.status: creating ompi/mpi/fortran/configure-fortran-output.h
config.status: creating opal/mca/hwloc/Makefile
config.status: creating opal/mca/hwloc/external/Makefile
config.status: creating opal/mca/hwloc/hwloc201/Makefile
config.status: creating opal/mca/hwloc/hwloc201/hwloc/Makefile
config.status: creating opal/mca/hwloc/hwloc201/hwloc/include/Makefile
config.status: creating opal/mca/hwloc/hwloc201/hwloc/hwloc/Makefile
config.status: creating opal/mca/common/Makefile
config.status: creating opal/mca/common/cuda/Makefile
config.status: creating opal/mca/common/sm/Makefile
config.status: creating opal/mca/common/ucx/Makefile
config.status: creating opal/mca/common/verbs/Makefile
config.status: creating opal/mca/common/verbs_usnic/Makefile
config.status: creating opal/mca/allocator/Makefile
config.status: creating opal/mca/allocator/basic/Makefile
config.status: creating opal/mca/allocator/bucket/Makefile
config.status: creating opal/mca/backtrace/Makefile
config.status: creating opal/mca/backtrace/execinfo/Makefile
config.status: creating opal/mca/backtrace/printstack/Makefile
config.status: creating opal/mca/backtrace/none/Makefile
config.status: creating opal/mca/btl/Makefile
config.status: creating opal/mca/btl/self/Makefile
config.status: creating opal/mca/btl/openib/Makefile
config.status: creating opal/mca/btl/portals4/Makefile
config.status: creating opal/mca/btl/sm/Makefile
config.status: creating opal/mca/btl/smcuda/Makefile
config.status: creating opal/mca/btl/tcp/Makefile
config.status: creating opal/mca/btl/uct/Makefile
config.status: creating opal/mca/btl/ugni/Makefile
config.status: creating opal/mca/btl/usnic/Makefile
config.status: creating opal/mca/btl/vader/Makefile
config.status: creating opal/mca/compress/Makefile
config.status: creating opal/mca/compress/bzip/Makefile
config.status: creating opal/mca/compress/gzip/Makefile
config.status: creating opal/mca/crs/Makefile
config.status: creating opal/mca/crs/none/Makefile
config.status: creating opal/mca/crs/self/Makefile
config.status: creating opal/mca/dl/Makefile
config.status: creating opal/mca/dl/dlopen/Makefile
config.status: creating opal/mca/dl/libltdl/Makefile
config.status: creating opal/mca/event/Makefile
config.status: creating opal/mca/event/external/Makefile
config.status: creating opal/mca/event/libevent2022/Makefile
config.status: creating opal/mca/if/Makefile
config.status: creating opal/mca/if/bsdx_ipv4/Makefile
config.status: creating opal/mca/if/bsdx_ipv6/Makefile
config.status: creating opal/mca/if/linux_ipv6/Makefile
config.status: creating opal/mca/if/posix_ipv4/Makefile
config.status: creating opal/mca/if/solaris_ipv6/Makefile
config.status: creating opal/mca/installdirs/Makefile
config.status: creating opal/mca/installdirs/env/Makefile
config.status: creating opal/mca/installdirs/config/Makefile
config.status: creating opal/mca/installdirs/config/install_dirs.h
config.status: creating opal/mca/memchecker/Makefile
config.status: creating opal/mca/memchecker/valgrind/Makefile
config.status: creating opal/mca/memcpy/Makefile
config.status: creating opal/mca/memory/Makefile
config.status: creating opal/mca/memory/patcher/Makefile
config.status: creating opal/mca/memory/malloc_solaris/Makefile
config.status: creating opal/mca/mpool/Makefile
config.status: creating opal/mca/mpool/hugepage/Makefile
config.status: creating opal/mca/mpool/memkind/Makefile
config.status: creating opal/mca/patcher/Makefile
config.status: creating opal/mca/patcher/linux/Makefile
config.status: creating opal/mca/patcher/overwrite/Makefile
config.status: creating opal/mca/pmix/Makefile
config.status: creating opal/mca/pmix/isolated/Makefile
config.status: creating opal/mca/pmix/cray/Makefile
config.status: creating opal/mca/pmix/ext1x/Makefile
config.status: creating opal/mca/pmix/ext2x/Makefile
config.status: creating opal/mca/pmix/ext3x/Makefile
config.status: creating opal/mca/pmix/flux/Makefile
config.status: creating opal/mca/pmix/pmix3x/Makefile
config.status: creating opal/mca/pmix/s1/Makefile
config.status: creating opal/mca/pmix/s2/Makefile
config.status: creating opal/mca/pstat/Makefile
config.status: creating opal/mca/pstat/linux/Makefile
config.status: creating opal/mca/pstat/test/Makefile
config.status: creating opal/mca/rcache/Makefile
config.status: creating opal/mca/rcache/grdma/Makefile
config.status: creating opal/mca/rcache/gpusm/Makefile
config.status: creating opal/mca/rcache/rgpusm/Makefile
config.status: creating opal/mca/rcache/udreg/Makefile
config.status: creating opal/mca/reachable/Makefile
config.status: creating opal/mca/reachable/weighted/Makefile
config.status: creating opal/mca/reachable/netlink/Makefile
config.status: creating opal/mca/shmem/Makefile
config.status: creating opal/mca/shmem/mmap/Makefile
config.status: creating opal/mca/shmem/posix/Makefile
config.status: creating opal/mca/shmem/sysv/Makefile
config.status: creating opal/mca/timer/Makefile
config.status: creating opal/mca/timer/altix/Makefile
config.status: creating opal/mca/timer/darwin/Makefile
config.status: creating opal/mca/timer/linux/Makefile
config.status: creating opal/mca/timer/solaris/Makefile
config.status: creating orte/mca/common/Makefile
config.status: creating orte/mca/common/alps/Makefile
config.status: creating orte/mca/errmgr/Makefile
config.status: creating orte/mca/errmgr/default_app/Makefile
config.status: creating orte/mca/errmgr/default_hnp/Makefile
config.status: creating orte/mca/errmgr/default_orted/Makefile
config.status: creating orte/mca/errmgr/default_tool/Makefile
config.status: creating orte/mca/ess/Makefile
config.status: creating orte/mca/ess/env/Makefile
config.status: creating orte/mca/ess/hnp/Makefile
config.status: creating orte/mca/ess/pmi/Makefile
config.status: creating orte/mca/ess/singleton/Makefile
config.status: creating orte/mca/ess/tool/Makefile
config.status: creating orte/mca/ess/alps/Makefile
config.status: creating orte/mca/ess/lsf/Makefile
config.status: creating orte/mca/ess/slurm/Makefile
config.status: creating orte/mca/ess/tm/Makefile
config.status: creating orte/mca/filem/Makefile
config.status: creating orte/mca/filem/raw/Makefile
config.status: creating orte/mca/grpcomm/Makefile
config.status: creating orte/mca/grpcomm/direct/Makefile
config.status: creating orte/mca/iof/Makefile
config.status: creating orte/mca/iof/hnp/Makefile
config.status: creating orte/mca/iof/orted/Makefile
config.status: creating orte/mca/iof/tool/Makefile
config.status: creating orte/mca/odls/Makefile
config.status: creating orte/mca/odls/alps/Makefile
config.status: creating orte/mca/odls/default/Makefile
config.status: creating orte/mca/odls/pspawn/Makefile
config.status: creating orte/mca/oob/Makefile
config.status: creating orte/mca/oob/alps/Makefile
config.status: creating orte/mca/oob/tcp/Makefile
config.status: creating orte/mca/plm/Makefile
config.status: creating orte/mca/plm/alps/Makefile
config.status: creating orte/mca/plm/isolated/Makefile
config.status: creating orte/mca/plm/lsf/Makefile
config.status: creating orte/mca/plm/rsh/Makefile
config.status: creating orte/mca/plm/slurm/Makefile
config.status: creating orte/mca/plm/tm/Makefile
config.status: creating orte/mca/ras/Makefile
config.status: creating orte/mca/ras/simulator/Makefile
config.status: creating orte/mca/ras/alps/Makefile
config.status: creating orte/mca/ras/gridengine/Makefile
config.status: creating orte/mca/ras/lsf/Makefile
config.status: creating orte/mca/ras/slurm/Makefile
config.status: creating orte/mca/ras/tm/Makefile
config.status: creating orte/mca/regx/Makefile
config.status: creating orte/mca/regx/fwd/Makefile
config.status: creating orte/mca/regx/naive/Makefile
config.status: creating orte/mca/regx/reverse/Makefile
config.status: creating orte/mca/rmaps/Makefile
config.status: creating orte/mca/rmaps/mindist/Makefile
config.status: creating orte/mca/rmaps/ppr/Makefile
config.status: creating orte/mca/rmaps/rank_file/Makefile
config.status: creating orte/mca/rmaps/resilient/Makefile
config.status: creating orte/mca/rmaps/round_robin/Makefile
config.status: creating orte/mca/rmaps/seq/Makefile
config.status: creating orte/mca/rml/Makefile
config.status: creating orte/mca/rml/oob/Makefile
config.status: creating orte/mca/routed/Makefile
config.status: creating orte/mca/routed/binomial/Makefile
config.status: creating orte/mca/routed/direct/Makefile
config.status: creating orte/mca/routed/radix/Makefile
config.status: creating orte/mca/rtc/Makefile
config.status: creating orte/mca/rtc/hwloc/Makefile
config.status: creating orte/mca/schizo/Makefile
config.status: creating orte/mca/schizo/flux/Makefile
config.status: creating orte/mca/schizo/ompi/Makefile
config.status: creating orte/mca/schizo/orte/Makefile
config.status: creating orte/mca/schizo/alps/Makefile
config.status: creating orte/mca/schizo/moab/Makefile
config.status: creating orte/mca/schizo/singularity/Makefile
config.status: creating orte/mca/schizo/slurm/Makefile
config.status: creating orte/mca/snapc/Makefile
config.status: creating orte/mca/snapc/full/Makefile
config.status: creating orte/mca/sstore/Makefile
config.status: creating orte/mca/sstore/central/Makefile
config.status: creating orte/mca/sstore/stage/Makefile
config.status: creating orte/mca/state/Makefile
config.status: creating orte/mca/state/app/Makefile
config.status: creating orte/mca/state/hnp/Makefile
config.status: creating orte/mca/state/novm/Makefile
config.status: creating orte/mca/state/orted/Makefile
config.status: creating orte/mca/state/tool/Makefile
 ---> Removed intermediate container 56d9a27141e1
 ---> f5551681735b
Step 6/18 : RUN KEYDUMP_URL=https://cloud.cees.ornl.gov/download &&     KEYDUMP_FILE=keydump &&     wget --quiet ${KEYDUMP_URL}/${KEYDUMP_FILE} &&     wget --quiet ${KEYDUMP_URL}/${KEYDUMP_FILE}.sig &&     gpg --import ${KEYDUMP_FILE} &&     gpg --verify ${KEYDUMP_FILE}.sig ${KEYDUMP_FILE} &&     rm ${KEYDUMP_FILE}*
 ---> Running in f60e382401e0
config.status: creating ompi/mca/common/Makefile
config.status: creating ompi/mca/common/monitoring/Makefile
config.status: creating ompi/mca/common/ompio/Makefile
config.status: creating ompi/mca/bml/Makefile
config.status: creating ompi/mca/bml/r2/Makefile
config.status: creating ompi/mca/coll/Makefile
config.status: creating ompi/mca/coll/basic/Makefile
config.status: creating ompi/mca/coll/inter/Makefile
config.status: creating ompi/mca/coll/libnbc/Makefile
config.status: creating ompi/mca/coll/self/Makefile
config.status: creating ompi/mca/coll/sm/Makefile
config.status: creating ompi/mca/coll/sync/Makefile
config.status: creating ompi/mca/coll/tuned/Makefile
config.status: creating ompi/mca/coll/cuda/Makefile
config.status: creating ompi/mca/coll/fca/Makefile
config.status: creating ompi/mca/coll/hcoll/Makefile
config.status: creating ompi/mca/coll/monitoring/Makefile
config.status: creating ompi/mca/coll/portals4/Makefile
config.status: creating ompi/mca/crcp/Makefile
config.status: creating ompi/mca/crcp/bkmrk/Makefile
config.status: creating ompi/mca/fbtl/Makefile
config.status: creating ompi/mca/fbtl/posix/Makefile
gpg: directory '/root/.gnupg' created
gpg: keybox '/root/.gnupg/pubring.kbx' created
gpg: /root/.gnupg/trustdb.gpg: trustdb created
gpg: key 48822FDA51C1DA7A: public key "Damien Lebrun-Grandie <dalg24@gmail.com>" imported
gpg: key A2C794A986419D8A: public key "Tom Stellard <tstellar@redhat.com>" imported
gpg: key 0FC3042E345AD05D: public key "Hans Wennborg <hans@chromium.org>" imported
gpg: key EC8FEF3A7BFB4EDA: 24 signatures not checked due to missing keys
gpg: key EC8FEF3A7BFB4EDA: public key "Brad King" imported
gpg: key 379CE192D401AB61: public key "Bintray (by JFrog) <bintray@bintray.com>" imported
gpg: Total number processed: 5
gpg:               imported: 5
gpg: no ultimately trusted keys found
gpg: Signature made Thu May  7 23:44:59 2020 UTC
gpg:                using RSA key 061CFF3BA41AA45D25BCE7097A0994F834C86684
gpg: Good signature from "Damien Lebrun-Grandie <dalg24@gmail.com>" [unknown]
gpg: WARNING: This key is not certified with a trusted signature!
gpg:          There is no indication that the signature belongs to the owner.
Primary key fingerprint: E226 98C7 0BF0 7BDA 37E1  4154 4882 2FDA 51C1 DA7A
     Subkey fingerprint: 061C FF3B A41A A45D 25BC  E709 7A09 94F8 34C8 6684
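
(If one wanted to re-check the imported signing key against the fingerprint gpg just printed, the standard query is shown below; this is not a step taken by the Dockerfile itself.)

gpg --fingerprint dalg24@gmail.com
# should report the primary fingerprint
#   E226 98C7 0BF0 7BDA 37E1  4154 4882 2FDA 51C1 DA7A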
config.status: creating ompi/mca/fbtl/pvfs2/Makefile
config.status: creating ompi/mca/fcoll/Makefile
config.status: creating ompi/mca/fcoll/dynamic/Makefile
config.status: creating ompi/mca/fcoll/dynamic_gen2/Makefile
config.status: creating ompi/mca/fcoll/individual/Makefile
config.status: creating ompi/mca/fcoll/two_phase/Makefile
config.status: creating ompi/mca/fcoll/vulcan/Makefile
config.status: creating ompi/mca/fs/Makefile
config.status: creating ompi/mca/fs/lustre/Makefile
config.status: creating ompi/mca/fs/pvfs2/Makefile
config.status: creating ompi/mca/fs/ufs/Makefile
config.status: creating ompi/mca/hook/Makefile
config.status: creating ompi/mca/io/Makefile
config.status: creating ompi/mca/io/ompio/Makefile
config.status: creating ompi/mca/io/romio321/Makefile
config.status: creating ompi/mca/mtl/Makefile
config.status: creating ompi/mca/mtl/ofi/Makefile
config.status: creating ompi/mca/mtl/portals4/Makefile
config.status: creating ompi/mca/mtl/psm/Makefile
config.status: creating ompi/mca/mtl/psm2/Makefile
config.status: creating ompi/mca/op/Makefile
config.status: creating ompi/mca/osc/Makefile
config.status: creating ompi/mca/osc/sm/Makefile
config.status: creating ompi/mca/osc/monitoring/Makefile
config.status: creating ompi/mca/osc/portals4/Makefile
config.status: creating ompi/mca/osc/pt2pt/Makefile
config.status: creating ompi/mca/osc/rdma/Makefile
config.status: creating ompi/mca/osc/ucx/Makefile
config.status: creating ompi/mca/pml/Makefile
config.status: creating ompi/mca/pml/cm/Makefile
config.status: creating ompi/mca/pml/crcpw/Makefile
config.status: creating ompi/mca/pml/monitoring/Makefile
config.status: creating ompi/mca/pml/ob1/Makefile
config.status: creating ompi/mca/pml/ucx/Makefile
config.status: creating ompi/mca/pml/v/Makefile
config.status: creating ompi/mca/pml/yalla/Makefile
config.status: creating ompi/mca/rte/Makefile
config.status: creating ompi/mca/rte/pmix/Makefile
config.status: creating ompi/mca/rte/orte/Makefile
config.status: creating ompi/mca/sharedfp/Makefile
config.status: creating ompi/mca/sharedfp/individual/Makefile
config.status: creating ompi/mca/sharedfp/lockedfile/Makefile
config.status: creating ompi/mca/sharedfp/sm/Makefile
config.status: creating ompi/mca/topo/Makefile
config.status: creating ompi/mca/topo/basic/Makefile
config.status: creating ompi/mca/topo/treematch/Makefile
config.status: creating ompi/mca/vprotocol/Makefile
config.status: creating ompi/mca/vprotocol/pessimist/Makefile
config.status: creating oshmem/mca/atomic/Makefile
config.status: creating oshmem/mca/atomic/basic/Makefile
config.status: creating oshmem/mca/atomic/mxm/Makefile
config.status: creating oshmem/mca/atomic/ucx/Makefile
config.status: creating oshmem/mca/memheap/Makefile
config.status: creating oshmem/mca/memheap/buddy/Makefile
config.status: creating oshmem/mca/memheap/ptmalloc/Makefile
config.status: creating oshmem/mca/scoll/Makefile
config.status: creating oshmem/mca/scoll/basic/Makefile
config.status: creating oshmem/mca/scoll/mpi/Makefile
config.status: creating oshmem/mca/scoll/fca/Makefile
config.status: creating oshmem/mca/spml/Makefile
config.status: creating oshmem/mca/spml/ikrit/Makefile
config.status: creating oshmem/mca/spml/ucx/Makefile
config.status: creating oshmem/mca/sshmem/Makefile
config.status: creating oshmem/mca/sshmem/mmap/Makefile
config.status: creating oshmem/mca/sshmem/sysv/Makefile
config.status: creating oshmem/mca/sshmem/ucx/Makefile
config.status: creating oshmem/mca/sshmem/verbs/Makefile
config.status: creating ompi/mpiext/affinity/Makefile
config.status: creating ompi/mpiext/affinity/c/Makefile
config.status: creating ompi/mpiext/cr/Makefile
config.status: creating ompi/mpiext/cr/c/Makefile
config.status: creating ompi/mpiext/cuda/Makefile
config.status: creating ompi/mpiext/cuda/c/Makefile
config.status: creating ompi/mpiext/pcollreq/Makefile
config.status: creating ompi/mpiext/pcollreq/c/Makefile
config.status: creating ompi/mpiext/pcollreq/c/profile/Makefile
config.status: creating ompi/mpiext/pcollreq/mpif-h/Makefile
config.status: creating ompi/mpiext/pcollreq/mpif-h/profile/Makefile
config.status: creating ompi/mpiext/pcollreq/use-mpi/Makefile
config.status: creating ompi/mpiext/pcollreq/use-mpi-f08/Makefile
config.status: creating ompi/contrib/libompitrace/Makefile
config.status: creating Makefile
config.status: creating config/Makefile
config.status: creating contrib/Makefile
config.status: creating contrib/dist/mofed/debian/changelog
config.status: creating contrib/dist/mofed/debian/control
config.status: creating contrib/dist/mofed/debian/copyright
config.status: creating test/Makefile
config.status: creating test/event/Makefile
config.status: creating test/asm/Makefile
config.status: creating test/datatype/Makefile
config.status: creating test/dss/Makefile
config.status: creating test/class/Makefile
config.status: creating test/mpool/Makefile
config.status: creating test/support/Makefile
config.status: creating test/threads/Makefile
config.status: creating test/util/Makefile
config.status: creating test/monitoring/Makefile
config.status: creating test/spc/Makefile
config.status: creating contrib/dist/mofed/debian/rules
config.status: creating contrib/dist/mofed/compile_debian_mlnx_example
config.status: creating opal/Makefile
config.status: creating opal/etc/Makefile
config.status: creating opal/include/Makefile
config.status: creating opal/datatype/Makefile
config.status: creating opal/util/Makefile
config.status: creating opal/util/keyval/Makefile
config.status: creating opal/mca/base/Makefile
config.status: creating opal/tools/wrappers/Makefile
config.status: creating opal/tools/wrappers/opalcc-wrapper-data.txt
config.status: creating opal/tools/wrappers/opalc++-wrapper-data.txt
config.status: creating opal/tools/wrappers/opal.pc
config.status: creating opal/tools/opal-checkpoint/Makefile
config.status: creating opal/tools/opal-restart/Makefile
config.status: creating orte/Makefile
config.status: creating orte/include/Makefile
config.status: creating orte/etc/Makefile
config.status: creating orte/tools/orted/Makefile
config.status: creating orte/tools/orterun/Makefile
config.status: creating orte/tools/wrappers/Makefile
config.status: creating orte/tools/wrappers/ortecc-wrapper-data.txt
config.status: creating orte/tools/wrappers/orte.pc
config.status: creating orte/tools/orte-clean/Makefile
config.status: creating orte/tools/orte-info/Makefile
config.status: creating orte/tools/orte-server/Makefile
config.status: creating ompi/Makefile
config.status: creating ompi/etc/Makefile
config.status: creating ompi/include/Makefile
config.status: creating ompi/include/mpif.h
config.status: creating ompi/include/mpif-config.h
config.status: creating ompi/datatype/Makefile
config.status: creating ompi/debuggers/Makefile
config.status: creating ompi/mpi/c/Makefile
config.status: creating ompi/mpi/c/profile/Makefile
config.status: creating ompi/mpi/cxx/Makefile
config.status: creating ompi/mpi/fortran/base/Makefile
config.status: creating ompi/mpi/fortran/mpif-h/Makefile
config.status: creating ompi/mpi/fortran/mpif-h/profile/Makefile
config.status: creating ompi/mpi/fortran/use-mpi-tkr/Makefile
config.status: creating ompi/mpi/fortran/use-mpi-tkr/fortran_sizes.h
config.status: creating ompi/mpi/fortran/use-mpi-tkr/fortran_kinds.sh
config.status: creating ompi/mpi/fortran/use-mpi-ignore-tkr/Makefile
config.status: creating ompi/mpi/fortran/use-mpi-ignore-tkr/mpi-ignore-tkr-interfaces.h
config.status: creating ompi/mpi/fortran/use-mpi-ignore-tkr/mpi-ignore-tkr-file-interfaces.h
config.status: creating ompi/mpi/fortran/use-mpi-ignore-tkr/mpi-ignore-tkr-removed-interfaces.h
config.status: creating ompi/mpi/fortran/use-mpi-f08/Makefile
config.status: creating ompi/mpi/fortran/use-mpi-f08/bindings/Makefile
config.status: creating ompi/mpi/fortran/use-mpi-f08/mod/Makefile
config.status: creating ompi/mpi/fortran/mpiext-use-mpi/Makefile
config.status: creating ompi/mpi/fortran/mpiext-use-mpi-f08/Makefile
config.status: creating ompi/mpi/tool/Makefile
config.status: creating ompi/mpi/tool/profile/Makefile
config.status: creating ompi/tools/ompi_info/Makefile
config.status: creating ompi/tools/wrappers/Makefile
config.status: creating ompi/tools/wrappers/mpicc-wrapper-data.txt
config.status: creating ompi/tools/wrappers/mpic++-wrapper-data.txt
config.status: creating ompi/tools/wrappers/mpifort-wrapper-data.txt
config.status: creating ompi/tools/wrappers/ompi.pc
config.status: creating ompi/tools/wrappers/ompi-c.pc
config.status: creating ompi/tools/wrappers/ompi-cxx.pc
config.status: creating ompi/tools/wrappers/ompi-fort.pc
config.status: creating ompi/tools/wrappers/mpijavac.pl
config.status: creating ompi/tools/mpisync/Makefile
config.status: creating oshmem/Makefile
config.status: creating oshmem/include/Makefile
config.status: creating oshmem/shmem/c/Makefile
config.status: creating oshmem/shmem/c/profile/Makefile
config.status: creating oshmem/shmem/fortran/Makefile
config.status: creating oshmem/shmem/fortran/profile/Makefile
config.status: creating oshmem/tools/oshmem_info/Makefile
config.status: creating oshmem/tools/wrappers/Makefile
config.status: creating oshmem/tools/wrappers/shmemcc-wrapper-data.txt
config.status: creating oshmem/tools/wrappers/shmemc++-wrapper-data.txt
config.status: creating oshmem/tools/wrappers/shmemfort-wrapper-data.txt
config.status: creating opal/include/opal_config.h
config.status: creating ompi/include/mpi.h
config.status: creating oshmem/include/shmem.h
config.status: creating opal/mca/hwloc/hwloc201/hwloc/include/private/autogen/config.h
config.status: creating opal/mca/hwloc/hwloc201/hwloc/include/hwloc/autogen/config.h
config.status: creating ompi/mpiext/cuda/c/mpiext_cuda_c.h
config.status: executing depfiles commands
 ---> Removed intermediate container f60e382401e0
 ---> e9404319edd4
Step 7/18 : ARG CMAKE_VERSION=3.26.3
 ---> Running in 30eebf0681b6
config.status: executing opal/mca/event/libevent2022/libevent/include/event2/event-config.h commands
config.status: executing ompi/mca/osc/monitoring/osc_monitoring_template_gen.h commands
config.status: executing libtool commands

Open MPI configuration:
-----------------------
Version: 4.0.2
Build MPI C bindings: yes
Build MPI C++ bindings (deprecated): no
Build MPI Fortran bindings: no
MPI Build Java bindings (experimental): no
Build Open SHMEM support: false (no spml)
Debug build: no
Platform file: (none)

Miscellaneous
-----------------------
CUDA support: yes
HWLOC support: internal
Libevent support: internal
PMIx support: Internal
 
Transports
-----------------------
Cisco usNIC: no
Cray uGNI (Gemini/Aries): no
Intel Omnipath (PSM2): no
Intel TrueScale (PSM): no
Mellanox MXM: no
Open UCX: no
OpenFabrics OFI Libfabric: no
OpenFabrics Verbs: no
Portals4: no
Shared memory/copy in+copy out: yes
Shared memory/Linux CMA: yes
Shared memory/Linux KNEM: no
Shared memory/XPMEM: no
TCP: yes
 
Resource Managers
-----------------------
Cray Alps: no
Grid Engine: no
LSF: no
Moab: no
Slurm: yes
ssh/rsh: yes
Torque: no
 
OMPIO File Systems
-----------------------
Generic Unix FS: yes
Lustre: no
PVFS2/OrangeFS: no
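
A minimal sketch of how a comparable out-of-tree Open MPI 4.0.2 build could be reproduced outside this CI image. The /opt/openmpi prefix and CUDA support are inferred from the summary and install paths in this log; any other configure flags the image actually uses are not visible here and are left out.

wget https://download.open-mpi.org/release/open-mpi/v4.0/openmpi-4.0.2.tar.gz
tar xf openmpi-4.0.2.tar.gz
mkdir build && cd build
../openmpi-4.0.2/configure --prefix=/opt/openmpi --with-cuda
make -j"$(nproc)" install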
 
Making install in config
make[1]: Entering directory '/scratch/build/config'
make[2]: Entering directory '/scratch/build/config'
make[2]: Nothing to be done for 'install-exec-am'.
make[2]: Nothing to be done for 'install-data-am'.
make[2]: Leaving directory '/scratch/build/config'
make[1]: Leaving directory '/scratch/build/config'
Making install in contrib
make[1]: Entering directory '/scratch/build/contrib'
make[2]: Entering directory '/scratch/build/contrib'
make[2]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi/amca-param-sets'
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../openmpi/contrib/openmpi-valgrind.supp '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../openmpi/contrib/amca-param-sets/example.conf '/opt/openmpi/share/openmpi/amca-param-sets'
make[2]: Leaving directory '/scratch/build/contrib'
make[1]: Leaving directory '/scratch/build/contrib'
Making install in opal
make[1]: Entering directory '/scratch/build/opal'
Making install in include
make[2]: Entering directory '/scratch/build/opal/include'
make[3]: Entering directory '/scratch/build/opal/include'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/opal/include'
make[2]: Leaving directory '/scratch/build/opal/include'
Making install in datatype
make[2]: Entering directory '/scratch/build/opal/datatype'
  CC       libdatatype_reliable_la-opal_datatype_pack.lo
  CC       libdatatype_reliable_la-opal_datatype_unpack.lo
  CC       opal_convertor.lo
  CC       opal_convertor_raw.lo
  CC       opal_copy_functions.lo
  CC       opal_copy_functions_heterogeneous.lo
  CC       opal_datatype_add.lo
  CC       opal_datatype_clone.lo
  CC       opal_datatype_copy.lo
  CC       opal_datatype_create.lo
  CC       opal_datatype_create_contiguous.lo
  CC       opal_datatype_destroy.lo
  CC       opal_datatype_dump.lo
  CC       opal_datatype_fake_stack.lo
  CC       opal_datatype_get_count.lo
  CC       opal_datatype_module.lo
  CC       opal_datatype_monotonic.lo
  CC       opal_datatype_optimize.lo
  CC       opal_datatype_pack.lo
  CC       opal_datatype_position.lo
  CC       opal_datatype_resize.lo
  CC       opal_datatype_unpack.lo
  CC       opal_datatype_cuda.lo
  CCLD     libdatatype_reliable.la
ar: `u' modifier ignored since `D' is the default (see `U')
  CCLD     libdatatype.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/datatype'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/opal/datatype'
make[2]: Leaving directory '/scratch/build/opal/datatype'
Making install in etc
make[2]: Entering directory '/scratch/build/opal/etc'
make[3]: Entering directory '/scratch/build/opal/etc'
make[3]: Nothing to be done for 'install-exec-am'.
/usr/bin/mkdir -p /opt/openmpi/etc
 /usr/bin/install -c -m 644 ../../../openmpi/opal/etc/openmpi-mca-params.conf /opt/openmpi/etc/openmpi-mca-params.conf
make[3]: Leaving directory '/scratch/build/opal/etc'
make[2]: Leaving directory '/scratch/build/opal/etc'
Making install in util
make[2]: Entering directory '/scratch/build/opal/util'
Making install in keyval
make[3]: Entering directory '/scratch/build/opal/util/keyval'
  CC       keyval_lex.lo
  CCLD     libopalutilkeyval.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[4]: Entering directory '/scratch/build/opal/util/keyval'
make[4]: Nothing to be done for 'install-exec-am'.
make[4]: Nothing to be done for 'install-data-am'.
make[4]: Leaving directory '/scratch/build/opal/util/keyval'
make[3]: Leaving directory '/scratch/build/opal/util/keyval'
make[3]: Entering directory '/scratch/build/opal/util'
  CC       alfg.lo
  CC       arch.lo
  CC       argv.lo
  CC       basename.lo
  CC       bipartite_graph.lo
  CC       cmd_line.lo
  CC       crc.lo
  CC       daemon_init.lo
  CC       ethtool.lo
  CC       error.lo
  CC       fd.lo
  CC       few.lo
  CC       if.lo
  CC       keyval_parse.lo
  CC       malloc.lo
  CC       net.lo
  CC       numtostr.lo
  CC       opal_environ.lo
  CC       opal_getcwd.lo
  CC       opal_pty.lo
  CC       os_dirpath.lo
  CC       os_path.lo
  CC       output.lo
  CC       path.lo
  CC       printf.lo
  CC       proc.lo
  CC       qsort.lo
  CC       show_help.lo
  CC       show_help_lex.lo
  CC       stacktrace.lo
  CC       strncpy.lo
  CC       sys_limits.lo
  CC       uri.lo
  CC       info_subscriber.lo
  CC       info.lo
  CCLD     libopalutil.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[4]: Entering directory '/scratch/build/opal/util'
make[4]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../openmpi/opal/util/help-opal-util.txt '/opt/openmpi/share/openmpi'
make[4]: Leaving directory '/scratch/build/opal/util'
make[3]: Leaving directory '/scratch/build/opal/util'
make[2]: Leaving directory '/scratch/build/opal/util'
Making install in mca/base
make[2]: Entering directory '/scratch/build/opal/mca/base'
  CC       mca_base_close.lo
  CC       mca_base_cmd_line.lo
  CC       mca_base_component_compare.lo
  CC       mca_base_component_find.lo
  CC       mca_base_component_repository.lo
  CC       mca_base_components_open.lo
  CC       mca_base_components_close.lo
  CC       mca_base_components_select.lo
  CC       mca_base_list.lo
  CC       mca_base_open.lo
  CC       mca_base_var.lo
  CC       mca_base_pvar.lo
  CC       mca_base_var_enum.lo
  CC       mca_base_var_group.lo
  CC       mca_base_parse_paramfile.lo
  CC       mca_base_components_register.lo
  CC       mca_base_framework.lo
  CCLD     libmca_base.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/base'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../openmpi/opal/mca/base/help-mca-base.txt ../../../../openmpi/opal/mca/base/help-mca-var.txt '/opt/openmpi/share/openmpi'
make[3]: Leaving directory '/scratch/build/opal/mca/base'
make[2]: Leaving directory '/scratch/build/opal/mca/base'
Making install in mca/common
make[2]: Entering directory '/scratch/build/opal/mca/common'
make[3]: Entering directory '/scratch/build/opal/mca/common'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/scratch/build/opal/mca/common'
make[2]: Leaving directory '/scratch/build/opal/mca/common'
Making install in mca/allocator
make[2]: Entering directory '/scratch/build/opal/mca/allocator'
  CC       base/allocator_base_frame.lo
  CCLD     libmca_allocator.la
ar: `u' modifier ignored since `D' is the default (see `U')
 ---> Removed intermediate container 30eebf0681b6
 ---> 91c5c6b3a7bf
Step 8/18 : ENV CMAKE_DIR=/opt/cmake
 ---> Running in 20c1195e80a8
make[3]: Entering directory '/scratch/build/opal/mca/allocator'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/opal/mca/allocator'
make[2]: Leaving directory '/scratch/build/opal/mca/allocator'
Making install in mca/backtrace
make[2]: Entering directory '/scratch/build/opal/mca/backtrace'
  CC       base/backtrace_component.lo
  CCLD     libmca_backtrace.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/backtrace'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/opal/mca/backtrace'
make[2]: Leaving directory '/scratch/build/opal/mca/backtrace'
Making install in mca/btl
make[2]: Entering directory '/scratch/build/opal/mca/btl'
  CC       base/btl_base_frame.lo
  CC       base/btl_base_error.lo
  CC       base/btl_base_select.lo
  CC       base/btl_base_mca.lo
  CCLD     libmca_btl.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/btl'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../openmpi/opal/mca/btl/base/help-mpi-btl-base.txt '/opt/openmpi/share/openmpi'
make[3]: Leaving directory '/scratch/build/opal/mca/btl'
make[2]: Leaving directory '/scratch/build/opal/mca/btl'
Making install in mca/compress
make[2]: Entering directory '/scratch/build/opal/mca/compress'
  CC       base/compress_base_open.lo
  CC       base/compress_base_close.lo
  CC       base/compress_base_select.lo
  CC       base/compress_base_fns.lo
  CCLD     libmca_compress.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/compress'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/opal/mca/compress'
make[2]: Leaving directory '/scratch/build/opal/mca/compress'
Making install in mca/crs
make[2]: Entering directory '/scratch/build/opal/mca/crs'
  GENERATE opal_crs.7
  CC       base/crs_base_open.lo
  CC       base/crs_base_close.lo
  CC       base/crs_base_select.lo
  CC       base/crs_base_fns.lo
  CCLD     libmca_crs.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/crs'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/man/man7'
 /usr/bin/install -c -m 644 opal_crs.7 '/opt/openmpi/share/man/man7'
make[3]: Leaving directory '/scratch/build/opal/mca/crs'
make[2]: Leaving directory '/scratch/build/opal/mca/crs'
Making install in mca/dl
make[2]: Entering directory '/scratch/build/opal/mca/dl'
  CC       base/dl_base_close.lo
  CC       base/dl_base_open.lo
  CC       base/dl_base_fns.lo
  CC       base/dl_base_select.lo
  CCLD     libmca_dl.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/dl'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/opal/mca/dl'
make[2]: Leaving directory '/scratch/build/opal/mca/dl'
Making install in mca/event
make[2]: Entering directory '/scratch/build/opal/mca/event'
  CC       base/event_base_frame.lo
  CCLD     libmca_event.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/event'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/opal/mca/event'
make[2]: Leaving directory '/scratch/build/opal/mca/event'
Making install in mca/hwloc
make[2]: Entering directory '/scratch/build/opal/mca/hwloc'
  CC       base/hwloc_base_frame.lo
  CC       base/hwloc_base_dt.lo
  CC       base/hwloc_base_maffinity.lo
  CC       base/hwloc_base_util.lo
  CCLD     libmca_hwloc.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/hwloc'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../openmpi/opal/mca/hwloc/base/help-opal-hwloc-base.txt '/opt/openmpi/share/openmpi'
make[3]: Leaving directory '/scratch/build/opal/mca/hwloc'
make[2]: Leaving directory '/scratch/build/opal/mca/hwloc'
Making install in mca/if
make[2]: Entering directory '/scratch/build/opal/mca/if'
  CC       base/if_base_components.lo
  CCLD     libmca_if.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/if'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/opal/mca/if'
make[2]: Leaving directory '/scratch/build/opal/mca/if'
Making install in mca/installdirs
make[2]: Entering directory '/scratch/build/opal/mca/installdirs'
  CC       base/installdirs_base_components.lo
  CC       base/installdirs_base_expand.lo
  CCLD     libmca_installdirs.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/installdirs'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/opal/mca/installdirs'
make[2]: Leaving directory '/scratch/build/opal/mca/installdirs'
Making install in mca/memchecker
make[2]: Entering directory '/scratch/build/opal/mca/memchecker'
  CC       base/memchecker_base_open.lo
  CC       base/memchecker_base_select.lo
  CC       base/memchecker_base_wrappers.lo
  CCLD     libmca_memchecker.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/memchecker'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/opal/mca/memchecker'
make[2]: Leaving directory '/scratch/build/opal/mca/memchecker'
Making install in mca/memcpy
make[2]: Entering directory '/scratch/build/opal/mca/memcpy'
  CC       base/memcpy_base_open.lo
  CCLD     libmca_memcpy.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/memcpy'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/opal/mca/memcpy'
make[2]: Leaving directory '/scratch/build/opal/mca/memcpy'
Making install in mca/memory
make[2]: Entering directory '/scratch/build/opal/mca/memory'
  CC       base/memory_base_open.lo
  CC       base/memory_base_empty.lo
  CCLD     libmca_memory.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/memory'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/opal/mca/memory'
make[2]: Leaving directory '/scratch/build/opal/mca/memory'
Making install in mca/mpool
make[2]: Entering directory '/scratch/build/opal/mca/mpool'
  CC       base/mpool_base_frame.lo
  CC       base/mpool_base_lookup.lo
  CC       base/mpool_base_alloc.lo
  CC       base/mpool_base_tree.lo
  CC       base/mpool_base_default.lo
  CC       base/mpool_base_basic.lo
  CCLD     libmca_mpool.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/mpool'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../openmpi/opal/mca/mpool/base/help-mpool-base.txt '/opt/openmpi/share/openmpi'
make[3]: Leaving directory '/scratch/build/opal/mca/mpool'
make[2]: Leaving directory '/scratch/build/opal/mca/mpool'
Making install in mca/patcher
make[2]: Entering directory '/scratch/build/opal/mca/patcher'
  CC       base/patcher_base_frame.lo
  CC       base/patcher_base_patch.lo
  CCLD     libmca_patcher.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/patcher'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/opal/mca/patcher'
make[2]: Leaving directory '/scratch/build/opal/mca/patcher'
Making install in mca/pmix
make[2]: Entering directory '/scratch/build/opal/mca/pmix'
  CC       base/pmix_base_frame.lo
  CC       base/pmix_base_select.lo
  CC       base/pmix_base_hash.lo
  CC       base/pmix_base_fns.lo
  CCLD     libmca_pmix.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/pmix'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../openmpi/opal/mca/pmix/base/help-pmix-base.txt '/opt/openmpi/share/openmpi'
make[3]: Leaving directory '/scratch/build/opal/mca/pmix'
make[2]: Leaving directory '/scratch/build/opal/mca/pmix'
Making install in mca/pstat
make[2]: Entering directory '/scratch/build/opal/mca/pstat'
  CC       base/pstat_base_select.lo
  CC       base/pstat_base_open.lo
  CCLD     libmca_pstat.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/pstat'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/opal/mca/pstat'
make[2]: Leaving directory '/scratch/build/opal/mca/pstat'
Making install in mca/rcache
make[2]: Entering directory '/scratch/build/opal/mca/rcache'
  CC       base/rcache_base_frame.lo
  CC       base/rcache_base_create.lo
  CC       base/rcache_base_vma_tree.lo
  CC       base/rcache_base_vma.lo
  CC       base/rcache_base_mem_cb.lo
  CCLD     libmca_rcache.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/rcache'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../openmpi/opal/mca/rcache/base/help-rcache-base.txt '/opt/openmpi/share/openmpi'
make[3]: Leaving directory '/scratch/build/opal/mca/rcache'
make[2]: Leaving directory '/scratch/build/opal/mca/rcache'
Making install in mca/reachable
make[2]: Entering directory '/scratch/build/opal/mca/reachable'
  CC       base/reachable_base_frame.lo
  CC       base/reachable_base_select.lo
  CC       base/reachable_base_alloc.lo
  CCLD     libmca_reachable.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/reachable'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/opal/mca/reachable'
make[2]: Leaving directory '/scratch/build/opal/mca/reachable'
Making install in mca/shmem
make[2]: Entering directory '/scratch/build/opal/mca/shmem'
  CC       base/shmem_base_close.lo
  CC       base/shmem_base_select.lo
  CC       base/shmem_base_open.lo
  CC       base/shmem_base_wrappers.lo
  CCLD     libmca_shmem.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/shmem'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/opal/mca/shmem'
make[2]: Leaving directory '/scratch/build/opal/mca/shmem'
Making install in mca/timer
make[2]: Entering directory '/scratch/build/opal/mca/timer'
  CC       base/timer_base_open.lo
  CCLD     libmca_timer.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/timer'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/opal/mca/timer'
make[2]: Leaving directory '/scratch/build/opal/mca/timer'
Making install in mca/backtrace/execinfo
make[2]: Entering directory '/scratch/build/opal/mca/backtrace/execinfo'
  CC       backtrace_execinfo.lo
  CC       backtrace_execinfo_component.lo
  CCLD     libmca_backtrace_execinfo.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/backtrace/execinfo'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/scratch/build/opal/mca/backtrace/execinfo'
make[2]: Leaving directory '/scratch/build/opal/mca/backtrace/execinfo'
Making install in mca/dl/dlopen
make[2]: Entering directory '/scratch/build/opal/mca/dl/dlopen'
  CC       dl_dlopen_component.lo
  CC       dl_dlopen_module.lo
  CCLD     libmca_dl_dlopen.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/dl/dlopen'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/scratch/build/opal/mca/dl/dlopen'
make[2]: Leaving directory '/scratch/build/opal/mca/dl/dlopen'
Making install in mca/event/libevent2022
make[2]: Entering directory '/scratch/build/opal/mca/event/libevent2022'
Making install in libevent
make[3]: Entering directory '/scratch/build/opal/mca/event/libevent2022/libevent'
Making install in .
make[4]: Entering directory '/scratch/build/opal/mca/event/libevent2022/libevent'
depbase=`echo event.lo | sed 's|[^/]*$|.deps/&|;s|\.lo$||'`;\
/bin/bash ./libtool  --tag=CC   --mode=compile gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent  -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include  -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include   -Drandom=opal_random  -w -Wall -fno-strict-aliasing -pthread -MT event.lo -MD -MP -MF $depbase.Tpo -c -o event.lo ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/event.c &&\
mv -f $depbase.Tpo $depbase.Plo
depbase=`echo evthread.lo | sed 's|[^/]*$|.deps/&|;s|\.lo$||'`;\
/bin/bash ./libtool  --tag=CC   --mode=compile gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent  -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include  -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include   -Drandom=opal_random  -w -Wall -fno-strict-aliasing -pthread -MT evthread.lo -MD -MP -MF $depbase.Tpo -c -o evthread.lo ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/evthread.c &&\
mv -f $depbase.Tpo $depbase.Plo
depbase=`echo evmap.lo | sed 's|[^/]*$|.deps/&|;s|\.lo$||'`;\
/bin/bash ./libtool  --tag=CC   --mode=compile gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent  -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include  -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include   -Drandom=opal_random  -w -Wall -fno-strict-aliasing -pthread -MT evmap.lo -MD -MP -MF $depbase.Tpo -c -o evmap.lo ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/evmap.c &&\
mv -f $depbase.Tpo $depbase.Plo
depbase=`echo log.lo | sed 's|[^/]*$|.deps/&|;s|\.lo$||'`;\
/bin/bash ./libtool  --tag=CC   --mode=compile gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent  -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include  -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include   -Drandom=opal_random  -w -Wall -fno-strict-aliasing -pthread -MT log.lo -MD -MP -MF $depbase.Tpo -c -o log.lo ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/log.c &&\
mv -f $depbase.Tpo $depbase.Plo
libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include -Drandom=opal_random -w -Wall -fno-strict-aliasing -pthread -MT evmap.lo -MD -MP -MF .deps/evmap.Tpo -c ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/evmap.c  -fPIC -DPIC -o .libs/evmap.o
libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include -Drandom=opal_random -w -Wall -fno-strict-aliasing -pthread -MT event.lo -MD -MP -MF .deps/event.Tpo -c ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/event.c  -fPIC -DPIC -o .libs/event.o
libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include -Drandom=opal_random -w -Wall -fno-strict-aliasing -pthread -MT evthread.lo -MD -MP -MF .deps/evthread.Tpo -c ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/evthread.c  -fPIC -DPIC -o .libs/evthread.o
libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include -Drandom=opal_random -w -Wall -fno-strict-aliasing -pthread -MT log.lo -MD -MP -MF .deps/log.Tpo -c ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/log.c  -fPIC -DPIC -o .libs/log.o
libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include -Drandom=opal_random -w -Wall -fno-strict-aliasing -pthread -MT log.lo -MD -MP -MF .deps/log.Tpo -c ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/log.c -o log.o >/dev/null 2>&1
libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include -Drandom=opal_random -w -Wall -fno-strict-aliasing -pthread -MT evthread.lo -MD -MP -MF .deps/evthread.Tpo -c ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/evthread.c -o evthread.o >/dev/null 2>&1
libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include -Drandom=opal_random -w -Wall -fno-strict-aliasing -pthread -MT evmap.lo -MD -MP -MF .deps/evmap.Tpo -c ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/evmap.c -o evmap.o >/dev/null 2>&1
depbase=`echo evutil.lo | sed 's|[^/]*$|.deps/&|;s|\.lo$||'`;\
/bin/bash ./libtool  --tag=CC   --mode=compile gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent  -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include  -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include   -Drandom=opal_random  -w -Wall -fno-strict-aliasing -pthread -MT evutil.lo -MD -MP -MF $depbase.Tpo -c -o evutil.lo ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/evutil.c &&\
mv -f $depbase.Tpo $depbase.Plo
depbase=`echo evutil_rand.lo | sed 's|[^/]*$|.deps/&|;s|\.lo$||'`;\
/bin/bash ./libtool  --tag=CC   --mode=compile gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent  -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include  -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include   -Drandom=opal_random  -w -Wall -fno-strict-aliasing -pthread -MT evutil_rand.lo -MD -MP -MF $depbase.Tpo -c -o evutil_rand.lo ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/evutil_rand.c &&\
mv -f $depbase.Tpo $depbase.Plo
depbase=`echo strlcpy.lo | sed 's|[^/]*$|.deps/&|;s|\.lo$||'`;\
/bin/bash ./libtool  --tag=CC   --mode=compile gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent  -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include  -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include   -Drandom=opal_random  -w -Wall -fno-strict-aliasing -pthread -MT strlcpy.lo -MD -MP -MF $depbase.Tpo -c -o strlcpy.lo ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/strlcpy.c &&\
mv -f $depbase.Tpo $depbase.Plo
libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include -Drandom=opal_random -w -Wall -fno-strict-aliasing -pthread -MT evutil.lo -MD -MP -MF .deps/evutil.Tpo -c ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/evutil.c  -fPIC -DPIC -o .libs/evutil.o
libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include -Drandom=opal_random -w -Wall -fno-strict-aliasing -pthread -MT evutil_rand.lo -MD -MP -MF .deps/evutil_rand.Tpo -c ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/evutil_rand.c  -fPIC -DPIC -o .libs/evutil_rand.o
libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include -Drandom=opal_random -w -Wall -fno-strict-aliasing -pthread -MT event.lo -MD -MP -MF .deps/event.Tpo -c ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/event.c -o event.o >/dev/null 2>&1
libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include -Drandom=opal_random -w -Wall -fno-strict-aliasing -pthread -MT strlcpy.lo -MD -MP -MF .deps/strlcpy.Tpo -c ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/strlcpy.c  -fPIC -DPIC -o .libs/strlcpy.o
libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include -Drandom=opal_random -w -Wall -fno-strict-aliasing -pthread -MT strlcpy.lo -MD -MP -MF .deps/strlcpy.Tpo -c ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/strlcpy.c -o strlcpy.o >/dev/null 2>&1
libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include -Drandom=opal_random -w -Wall -fno-strict-aliasing -pthread -MT evutil_rand.lo -MD -MP -MF .deps/evutil_rand.Tpo -c ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/evutil_rand.c -o evutil_rand.o >/dev/null 2>&1
depbase=`echo select.lo | sed 's|[^/]*$|.deps/&|;s|\.lo$||'`;\
/bin/bash ./libtool  --tag=CC   --mode=compile gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent  -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include  -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include   -Drandom=opal_random  -w -Wall -fno-strict-aliasing -pthread -MT select.lo -MD -MP -MF $depbase.Tpo -c -o select.lo ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/select.c &&\
mv -f $depbase.Tpo $depbase.Plo
libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include -Drandom=opal_random -w -Wall -fno-strict-aliasing -pthread -MT evutil.lo -MD -MP -MF .deps/evutil.Tpo -c ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/evutil.c -o evutil.o >/dev/null 2>&1
depbase=`echo poll.lo | sed 's|[^/]*$|.deps/&|;s|\.lo$||'`;\
/bin/bash ./libtool  --tag=CC   --mode=compile gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent  -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include  -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include   -Drandom=opal_random  -w -Wall -fno-strict-aliasing -pthread -MT poll.lo -MD -MP -MF $depbase.Tpo -c -o poll.lo ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/poll.c &&\
mv -f $depbase.Tpo $depbase.Plo
libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include -Drandom=opal_random -w -Wall -fno-strict-aliasing -pthread -MT select.lo -MD -MP -MF .deps/select.Tpo -c ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/select.c  -fPIC -DPIC -o .libs/select.o
libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include -Drandom=opal_random -w -Wall -fno-strict-aliasing -pthread -MT poll.lo -MD -MP -MF .deps/poll.Tpo -c ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/poll.c  -fPIC -DPIC -o .libs/poll.o
depbase=`echo epoll.lo | sed 's|[^/]*$|.deps/&|;s|\.lo$||'`;\
/bin/bash ./libtool  --tag=CC   --mode=compile gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent  -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include  -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include   -Drandom=opal_random  -w -Wall -fno-strict-aliasing -pthread -MT epoll.lo -MD -MP -MF $depbase.Tpo -c -o epoll.lo ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/epoll.c &&\
mv -f $depbase.Tpo $depbase.Plo
libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include -Drandom=opal_random -w -Wall -fno-strict-aliasing -pthread -MT select.lo -MD -MP -MF .deps/select.Tpo -c ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/select.c -o select.o >/dev/null 2>&1
depbase=`echo signal.lo | sed 's|[^/]*$|.deps/&|;s|\.lo$||'`;\
/bin/bash ./libtool  --tag=CC   --mode=compile gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent  -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include  -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include   -Drandom=opal_random  -w -Wall -fno-strict-aliasing -pthread -MT signal.lo -MD -MP -MF $depbase.Tpo -c -o signal.lo ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/signal.c &&\
mv -f $depbase.Tpo $depbase.Plo
libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include -Drandom=opal_random -w -Wall -fno-strict-aliasing -pthread -MT poll.lo -MD -MP -MF .deps/poll.Tpo -c ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/poll.c -o poll.o >/dev/null 2>&1
libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include -Drandom=opal_random -w -Wall -fno-strict-aliasing -pthread -MT epoll.lo -MD -MP -MF .deps/epoll.Tpo -c ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/epoll.c  -fPIC -DPIC -o .libs/epoll.o
libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include -Drandom=opal_random -w -Wall -fno-strict-aliasing -pthread -MT signal.lo -MD -MP -MF .deps/signal.Tpo -c ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/signal.c  -fPIC -DPIC -o .libs/signal.o
depbase=`echo evthread_pthread.lo | sed 's|[^/]*$|.deps/&|;s|\.lo$||'`;\
/bin/bash ./libtool  --tag=CC   --mode=compile gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent  -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include  -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include   -Drandom=opal_random  -w -Wall -fno-strict-aliasing -pthread -MT evthread_pthread.lo -MD -MP -MF $depbase.Tpo -c -o evthread_pthread.lo ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/evthread_pthread.c &&\
mv -f $depbase.Tpo $depbase.Plo
libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include -Drandom=opal_random -w -Wall -fno-strict-aliasing -pthread -MT epoll.lo -MD -MP -MF .deps/epoll.Tpo -c ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/epoll.c -o epoll.o >/dev/null 2>&1
libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include -Drandom=opal_random -w -Wall -fno-strict-aliasing -pthread -MT signal.lo -MD -MP -MF .deps/signal.Tpo -c ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/signal.c -o signal.o >/dev/null 2>&1
libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include -Drandom=opal_random -w -Wall -fno-strict-aliasing -pthread -MT evthread_pthread.lo -MD -MP -MF .deps/evthread_pthread.Tpo -c ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/evthread_pthread.c  -fPIC -DPIC -o .libs/evthread_pthread.o
libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/compat -I../../../../../../openmpi/opal/mca/event/libevent2022/libevent/include -I./include -I/scratch/openmpi -I/scratch/build -I/scratch/openmpi/opal/include -I/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include -I/scratch/openmpi/opal/mca/hwloc/hwloc201/hwloc/include -Drandom=opal_random -w -Wall -fno-strict-aliasing -pthread -MT evthread_pthread.lo -MD -MP -MF .deps/evthread_pthread.Tpo -c ../../../../../../openmpi/opal/mca/event/libevent2022/libevent/evthread_pthread.c -o evthread_pthread.o >/dev/null 2>&1
/bin/bash ./libtool  --tag=CC   --mode=link gcc  -w -Wall -fno-strict-aliasing -pthread   -o libevent.la  event.lo evthread.lo evmap.lo log.lo evutil.lo evutil_rand.lo strlcpy.lo select.lo poll.lo   epoll.lo  signal.lo    evthread_pthread.lo    
libtool: link: ar cru .libs/libevent.a .libs/event.o .libs/evthread.o .libs/evmap.o .libs/log.o .libs/evutil.o .libs/evutil_rand.o .libs/strlcpy.o .libs/select.o .libs/poll.o .libs/epoll.o .libs/signal.o .libs/evthread_pthread.o 
ar: `u' modifier ignored since `D' is the default (see `U')
libtool: link: ranlib .libs/libevent.a
libtool: link: ( cd ".libs" && rm -f "libevent.la" && ln -s "../libevent.la" "libevent.la" )
make[5]: Entering directory '/scratch/build/opal/mca/event/libevent2022/libevent'
make[5]: Nothing to be done for 'install-exec-am'.
make[5]: Nothing to be done for 'install-data-am'.
make[5]: Leaving directory '/scratch/build/opal/mca/event/libevent2022/libevent'
make[4]: Leaving directory '/scratch/build/opal/mca/event/libevent2022/libevent'
Making install in include
make[4]: Entering directory '/scratch/build/opal/mca/event/libevent2022/libevent/include'
make[5]: Entering directory '/scratch/build/opal/mca/event/libevent2022/libevent/include'
make[5]: Nothing to be done for 'install-exec-am'.
make[5]: Nothing to be done for 'install-data-am'.
make[5]: Leaving directory '/scratch/build/opal/mca/event/libevent2022/libevent/include'
make[4]: Leaving directory '/scratch/build/opal/mca/event/libevent2022/libevent/include'
make[3]: Leaving directory '/scratch/build/opal/mca/event/libevent2022/libevent'
make[3]: Entering directory '/scratch/build/opal/mca/event/libevent2022'
  CC       libevent2022_component.lo
  CC       libevent2022_module.lo
  CCLD     libmca_event_libevent2022.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[4]: Entering directory '/scratch/build/opal/mca/event/libevent2022'
make[4]: Nothing to be done for 'install-exec-am'.
make[4]: Leaving directory '/scratch/build/opal/mca/event/libevent2022'
make[3]: Leaving directory '/scratch/build/opal/mca/event/libevent2022'
make[2]: Leaving directory '/scratch/build/opal/mca/event/libevent2022'
Making install in mca/hwloc/hwloc201
make[2]: Entering directory '/scratch/build/opal/mca/hwloc/hwloc201'
Making install in hwloc
make[3]: Entering directory '/scratch/build/opal/mca/hwloc/hwloc201/hwloc'
Making install in include
make[4]: Entering directory '/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include'
make[5]: Entering directory '/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include'
make[5]: Nothing to be done for 'install-exec-am'.
make[5]: Leaving directory '/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include'
make[4]: Leaving directory '/scratch/build/opal/mca/hwloc/hwloc201/hwloc/include'
Making install in hwloc
make[4]: Entering directory '/scratch/build/opal/mca/hwloc/hwloc201/hwloc/hwloc'
  CC       topology.lo
  CC       traversal.lo
  CC       distances.lo
  CC       components.lo
  CC       bind.lo
  CC       bitmap.lo
  CC       pci-common.lo
  CC       diff.lo
  CC       shmem.lo
  CC       misc.lo
  CC       base64.lo
  CC       topology-noos.lo
  CC       topology-synthetic.lo
  CC       topology-xml.lo
  CC       topology-xml-nolibxml.lo
  CC       topology-linux.lo
  CC       topology-hardwired.lo
  CC       topology-x86.lo
  CCLD     libhwloc_embedded.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[5]: Entering directory '/scratch/build/opal/mca/hwloc/hwloc201/hwloc/hwloc'
make  install-exec-hook
make[6]: Entering directory '/scratch/build/opal/mca/hwloc/hwloc201/hwloc/hwloc'
make[6]: Nothing to be done for 'install-exec-hook'.
make[6]: Leaving directory '/scratch/build/opal/mca/hwloc/hwloc201/hwloc/hwloc'
make[5]: Leaving directory '/scratch/build/opal/mca/hwloc/hwloc201/hwloc/hwloc'
make[4]: Leaving directory '/scratch/build/opal/mca/hwloc/hwloc201/hwloc/hwloc'
make[4]: Entering directory '/scratch/build/opal/mca/hwloc/hwloc201/hwloc'
make[5]: Entering directory '/scratch/build/opal/mca/hwloc/hwloc201/hwloc'
make[5]: Nothing to be done for 'install-exec-am'.
make[5]: Leaving directory '/scratch/build/opal/mca/hwloc/hwloc201/hwloc'
make[4]: Leaving directory '/scratch/build/opal/mca/hwloc/hwloc201/hwloc'
make[3]: Leaving directory '/scratch/build/opal/mca/hwloc/hwloc201/hwloc'
make[3]: Entering directory '/scratch/build/opal/mca/hwloc/hwloc201'
  CC       hwloc201_component.lo
  CCLD     libmca_hwloc_hwloc201.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[4]: Entering directory '/scratch/build/opal/mca/hwloc/hwloc201'
make[4]: Nothing to be done for 'install-exec-am'.
make[4]: Leaving directory '/scratch/build/opal/mca/hwloc/hwloc201'
make[3]: Leaving directory '/scratch/build/opal/mca/hwloc/hwloc201'
make[2]: Leaving directory '/scratch/build/opal/mca/hwloc/hwloc201'
Making install in mca/if/linux_ipv6
make[2]: Entering directory '/scratch/build/opal/mca/if/linux_ipv6'
  CC       if_linux_ipv6.lo
 ---> Removed intermediate container 20c1195e80a8
 ---> a89b2fde6677
Step 9/18 : RUN CMAKE_KEY=2D2CEF1034921684 &&     CMAKE_URL=https://github.com/Kitware/CMake/releases/download/v${CMAKE_VERSION} &&     CMAKE_SCRIPT=cmake-${CMAKE_VERSION}-Linux-x86_64.sh &&     CMAKE_SHA256=cmake-${CMAKE_VERSION}-SHA-256.txt &&     wget --quiet ${CMAKE_URL}/${CMAKE_SHA256} &&     wget --quiet ${CMAKE_URL}/${CMAKE_SHA256}.asc &&     wget --quiet ${CMAKE_URL}/${CMAKE_SCRIPT} &&     gpg --verify ${CMAKE_SHA256}.asc ${CMAKE_SHA256} &&     grep -i ${CMAKE_SCRIPT} ${CMAKE_SHA256} | sed -e s/linux/Linux/ | sha256sum --check &&     mkdir -p ${CMAKE_DIR} &&     sh ${CMAKE_SCRIPT} --skip-license --prefix=${CMAKE_DIR} &&     rm cmake*
 ---> Running in 66023e29de7c
  CCLD     libmca_if_linux_ipv6.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/if/linux_ipv6'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/scratch/build/opal/mca/if/linux_ipv6'
make[2]: Leaving directory '/scratch/build/opal/mca/if/linux_ipv6'
Making install in mca/if/posix_ipv4
make[2]: Entering directory '/scratch/build/opal/mca/if/posix_ipv4'
  CC       if_posix.lo
  CCLD     libmca_if_posix_ipv4.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/if/posix_ipv4'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/scratch/build/opal/mca/if/posix_ipv4'
make[2]: Leaving directory '/scratch/build/opal/mca/if/posix_ipv4'
Making install in mca/installdirs/env
make[2]: Entering directory '/scratch/build/opal/mca/installdirs/env'
  CC       opal_installdirs_env.lo
  CCLD     libmca_installdirs_env.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/installdirs/env'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/scratch/build/opal/mca/installdirs/env'
make[2]: Leaving directory '/scratch/build/opal/mca/installdirs/env'
Making install in mca/installdirs/config
make[2]: Entering directory '/scratch/build/opal/mca/installdirs/config'
  CC       opal_installdirs_config.lo
  CCLD     libmca_installdirs_config.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/installdirs/config'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/scratch/build/opal/mca/installdirs/config'
make[2]: Leaving directory '/scratch/build/opal/mca/installdirs/config'
Making install in mca/memory/patcher
make[2]: Entering directory '/scratch/build/opal/mca/memory/patcher'
  CC       memory_patcher_component.lo
  CCLD     libmca_memory_patcher.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/memory/patcher'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/scratch/build/opal/mca/memory/patcher'
make[2]: Leaving directory '/scratch/build/opal/mca/memory/patcher'
Making install in mca/timer/linux
make[2]: Entering directory '/scratch/build/opal/mca/timer/linux'
  CC       timer_linux_component.lo
  CCLD     libmca_timer_linux.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/opal/mca/timer/linux'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/opal/mca/timer/linux/help-opal-timer-linux.txt '/opt/openmpi/share/openmpi'
make[3]: Leaving directory '/scratch/build/opal/mca/timer/linux'
make[2]: Leaving directory '/scratch/build/opal/mca/timer/linux'
Making install in .
make[2]: Entering directory '/scratch/build/opal'
  CC       class/opal_bitmap.lo
  CC       class/opal_free_list.lo
  CC       class/opal_hash_table.lo
  CC       class/opal_hotel.lo
  CC       class/opal_tree.lo
  CC       class/opal_list.lo
  CC       class/opal_object.lo
  CC       class/opal_graph.lo
  CC       class/opal_lifo.lo
  CC       class/opal_fifo.lo
  CC       class/opal_pointer_array.lo
  CC       class/opal_value_array.lo
  CC       class/opal_ring_buffer.lo
  CC       class/opal_rb_tree.lo
  CC       class/opal_interval_tree.lo
  CC       memoryhooks/memory.lo
  CC       runtime/opal_progress.lo
  CC       runtime/opal_finalize.lo
  CC       runtime/opal_init.lo
  CC       runtime/opal_params.lo
  CC       runtime/opal_cr.lo
  CC       runtime/opal_info_support.lo
  CC       runtime/opal_progress_threads.lo
  CC       threads/condition.lo
  CC       threads/mutex.lo
  CC       threads/thread.lo
  CC       threads/wait_sync.lo
  CC       dss/dss_internal_functions.lo
  CC       dss/dss_compare.lo
  CC       dss/dss_copy.lo
  CC       dss/dss_dump.lo
  CC       dss/dss_load_unload.lo
  CC       dss/dss_lookup.lo
  CC       dss/dss_pack.lo
  CC       dss/dss_peek.lo
  CC       dss/dss_print.lo
  CC       dss/dss_register.lo
  CC       dss/dss_unpack.lo
  CC       dss/dss_open_close.lo
  CCLD     libopen-pal.la
make[3]: Entering directory '/scratch/build/opal'
 /usr/bin/mkdir -p '/opt/openmpi/lib'
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /bin/bash ../libtool   --mode=install /usr/bin/install -c   libopen-pal.la '/opt/openmpi/lib'
 /usr/bin/install -c -m 644 ../../openmpi/opal/runtime/help-opal-runtime.txt ../../openmpi/opal/runtime/help-opal_info.txt '/opt/openmpi/share/openmpi'
libtool: install: /usr/bin/install -c .libs/libopen-pal.so.40.20.2 /opt/openmpi/lib/libopen-pal.so.40.20.2
libtool: install: (cd /opt/openmpi/lib && { ln -s -f libopen-pal.so.40.20.2 libopen-pal.so.40 || { rm -f libopen-pal.so.40 && ln -s libopen-pal.so.40.20.2 libopen-pal.so.40; }; })
libtool: install: (cd /opt/openmpi/lib && { ln -s -f libopen-pal.so.40.20.2 libopen-pal.so || { rm -f libopen-pal.so && ln -s libopen-pal.so.40.20.2 libopen-pal.so; }; })
libtool: install: /usr/bin/install -c .libs/libopen-pal.lai /opt/openmpi/lib/libopen-pal.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
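A minimal sketch of the linking guidance above, assuming a hypothetical test.c that calls into the installed libopen-pal; the -L and -Wl,-rpath flags mirror the options libtool lists, so the runtime loader finds the library without setting LD_LIBRARY_PATH:

  # hypothetical example: link test.c against /opt/openmpi/lib/libopen-pal.so
  gcc test.c -L/opt/openmpi/lib -lopen-pal -Wl,-rpath,/opt/openmpi/lib -o test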
make[3]: Leaving directory '/scratch/build/opal'
make[2]: Leaving directory '/scratch/build/opal'
Making install in mca/common/cuda
make[2]: Entering directory '/scratch/build/opal/mca/common/cuda'
  CC       common_cuda.lo
  LN_S     libmca_common_cuda.la
gpg: Signature made Tue Apr  4 19:39:52 2023 UTC
gpg:                using RSA key C6C265324BBEBDC350B513D02D2CEF1034921684
gpg: Good signature from "Brad King" [unknown]
gpg:                 aka "Brad King <brad.king@kitware.com>" [unknown]
gpg:                 aka "[jpeg image of size 4005]" [unknown]
gpg: Note: This key has expired!
Primary key fingerprint: CBA2 3971 357C 2E65 90D9  EFD3 EC8F EF3A 7BFB 4EDA
     Subkey fingerprint: C6C2 6532 4BBE BDC3 50B5  13D0 2D2C EF10 3492 1684
cmake-3.26.3-Linux-x86_64.sh: OK
CMake Installer Version: 3.26.3, Copyright (c) Kitware
This is a self-extracting archive.
The archive will be extracted to: /opt/cmake

Using target directory: /opt/cmake
Extracting, please wait...

  CCLD     libmca_common_cuda.la
make[3]: Entering directory '/scratch/build/opal/mca/common/cuda'
 /usr/bin/mkdir -p '/opt/openmpi/lib'
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   libmca_common_cuda.la '/opt/openmpi/lib'
 /usr/bin/install -c -m 644 ../../../../../openmpi/opal/mca/common/cuda/help-mpi-common-cuda.txt '/opt/openmpi/share/openmpi'
libtool: install: /usr/bin/install -c .libs/libmca_common_cuda.so.40.20.0 /opt/openmpi/lib/libmca_common_cuda.so.40.20.0
libtool: install: (cd /opt/openmpi/lib && { ln -s -f libmca_common_cuda.so.40.20.0 libmca_common_cuda.so.40 || { rm -f libmca_common_cuda.so.40 && ln -s libmca_common_cuda.so.40.20.0 libmca_common_cuda.so.40; }; })
libtool: install: (cd /opt/openmpi/lib && { ln -s -f libmca_common_cuda.so.40.20.0 libmca_common_cuda.so || { rm -f libmca_common_cuda.so && ln -s libmca_common_cuda.so.40.20.0 libmca_common_cuda.so; }; })
libtool: install: /usr/bin/install -c .libs/libmca_common_cuda.lai /opt/openmpi/lib/libmca_common_cuda.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib
make[3]: Leaving directory '/scratch/build/opal/mca/common/cuda'
make[2]: Leaving directory '/scratch/build/opal/mca/common/cuda'
Making install in mca/common/sm
make[2]: Entering directory '/scratch/build/opal/mca/common/sm'
  CC       common_sm.lo
  CC       common_sm_mpool.lo
  LN_S     libmca_common_sm.la
  CCLD     libmca_common_sm.la
make[3]: Entering directory '/scratch/build/opal/mca/common/sm'
 /usr/bin/mkdir -p '/opt/openmpi/lib'
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   libmca_common_sm.la '/opt/openmpi/lib'
 /usr/bin/install -c -m 644 ../../../../../openmpi/opal/mca/common/sm/help-mpi-common-sm.txt '/opt/openmpi/share/openmpi'
libtool: install: /usr/bin/install -c .libs/libmca_common_sm.so.40.20.0 /opt/openmpi/lib/libmca_common_sm.so.40.20.0
libtool: install: (cd /opt/openmpi/lib && { ln -s -f libmca_common_sm.so.40.20.0 libmca_common_sm.so.40 || { rm -f libmca_common_sm.so.40 && ln -s libmca_common_sm.so.40.20.0 libmca_common_sm.so.40; }; })
libtool: install: (cd /opt/openmpi/lib && { ln -s -f libmca_common_sm.so.40.20.0 libmca_common_sm.so || { rm -f libmca_common_sm.so && ln -s libmca_common_sm.so.40.20.0 libmca_common_sm.so; }; })
libtool: install: /usr/bin/install -c .libs/libmca_common_sm.lai /opt/openmpi/lib/libmca_common_sm.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib
make[3]: Leaving directory '/scratch/build/opal/mca/common/sm'
make[2]: Leaving directory '/scratch/build/opal/mca/common/sm'
Making install in mca/allocator/basic
make[2]: Entering directory '/scratch/build/opal/mca/allocator/basic'
  CC       allocator_basic.lo
  CCLD     mca_allocator_basic.la
make[3]: Entering directory '/scratch/build/opal/mca/allocator/basic'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_allocator_basic.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_allocator_basic.la'
libtool: install: (cd /scratch/build/opal/mca/allocator/basic; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_allocator_basic.la -rpath /opt/openmpi/lib/openmpi allocator_basic.lo ../../../../opal/libopen-pal.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_allocator_basic.soT /opt/openmpi/lib/openmpi/mca_allocator_basic.so
libtool: install: /usr/bin/install -c .libs/mca_allocator_basic.lai /opt/openmpi/lib/openmpi/mca_allocator_basic.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib/openmpi

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
make[3]: Leaving directory '/scratch/build/opal/mca/allocator/basic'
make[2]: Leaving directory '/scratch/build/opal/mca/allocator/basic'
Making install in mca/allocator/bucket
make[2]: Entering directory '/scratch/build/opal/mca/allocator/bucket'
  CC       allocator_bucket.lo
  CC       allocator_bucket_alloc.lo
  CCLD     mca_allocator_bucket.la
make[3]: Entering directory '/scratch/build/opal/mca/allocator/bucket'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_allocator_bucket.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_allocator_bucket.la'
libtool: install: (cd /scratch/build/opal/mca/allocator/bucket; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_allocator_bucket.la -rpath /opt/openmpi/lib/openmpi allocator_bucket.lo allocator_bucket_alloc.lo ../../../../opal/libopen-pal.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_allocator_bucket.soT /opt/openmpi/lib/openmpi/mca_allocator_bucket.so
libtool: install: /usr/bin/install -c .libs/mca_allocator_bucket.lai /opt/openmpi/lib/openmpi/mca_allocator_bucket.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/opal/mca/allocator/bucket'
make[2]: Leaving directory '/scratch/build/opal/mca/allocator/bucket'
Making install in mca/btl/self
make[2]: Entering directory '/scratch/build/opal/mca/btl/self'
  CC       btl_self.lo
  CC       btl_self_component.lo
  CC       btl_self_frag.lo
  CCLD     mca_btl_self.la
make[3]: Entering directory '/scratch/build/opal/mca/btl/self'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_btl_self.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_btl_self.la'
libtool: install: (cd /scratch/build/opal/mca/btl/self; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_btl_self.la -rpath /opt/openmpi/lib/openmpi btl_self.lo btl_self_component.lo btl_self_frag.lo ../../../../opal/libopen-pal.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_btl_self.soT /opt/openmpi/lib/openmpi/mca_btl_self.so
libtool: install: /usr/bin/install -c .libs/mca_btl_self.lai /opt/openmpi/lib/openmpi/mca_btl_self.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/opal/mca/btl/self'
make[2]: Leaving directory '/scratch/build/opal/mca/btl/self'
Making install in mca/btl/sm
make[2]: Entering directory '/scratch/build/opal/mca/btl/sm'
  CC       mca_btl_sm_la-btl_sm_component.lo
  CCLD     mca_btl_sm.la
make[3]: Entering directory '/scratch/build/opal/mca/btl/sm'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/opal/mca/btl/sm/help-mpi-btl-sm.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_btl_sm.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_btl_sm.la'
libtool: install: (cd /scratch/build/opal/mca/btl/sm; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_btl_sm.la -rpath /opt/openmpi/lib/openmpi mca_btl_sm_la-btl_sm_component.lo ../../../../opal/libopen-pal.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_btl_sm.soT /opt/openmpi/lib/openmpi/mca_btl_sm.so
libtool: install: /usr/bin/install -c .libs/mca_btl_sm.lai /opt/openmpi/lib/openmpi/mca_btl_sm.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/opal/mca/btl/sm'
make[2]: Leaving directory '/scratch/build/opal/mca/btl/sm'
Making install in mca/btl/smcuda
make[2]: Entering directory '/scratch/build/opal/mca/btl/smcuda'
  CC       mca_btl_smcuda_la-btl_smcuda.lo
  CC       mca_btl_smcuda_la-btl_smcuda_component.lo
  CC       mca_btl_smcuda_la-btl_smcuda_frag.lo
  CCLD     mca_btl_smcuda.la
make[3]: Entering directory '/scratch/build/opal/mca/btl/smcuda'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/opal/mca/btl/smcuda/help-mpi-btl-smcuda.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_btl_smcuda.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_btl_smcuda.la'
libtool: install: (cd /scratch/build/opal/mca/btl/smcuda; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_btl_smcuda.la -rpath /opt/openmpi/lib/openmpi mca_btl_smcuda_la-btl_smcuda.lo mca_btl_smcuda_la-btl_smcuda_component.lo mca_btl_smcuda_la-btl_smcuda_frag.lo ../../../../opal/libopen-pal.la /scratch/build/opal/mca/common/sm/libmca_common_sm.la /scratch/build/opal/mca/common/cuda/libmca_common_cuda.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_btl_smcuda.soT /opt/openmpi/lib/openmpi/mca_btl_smcuda.so
libtool: install: /usr/bin/install -c .libs/mca_btl_smcuda.lai /opt/openmpi/lib/openmpi/mca_btl_smcuda.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/opal/mca/btl/smcuda'
make[2]: Leaving directory '/scratch/build/opal/mca/btl/smcuda'
Making install in mca/btl/tcp
make[2]: Entering directory '/scratch/build/opal/mca/btl/tcp'
  CC       btl_tcp.lo
  CC       btl_tcp_component.lo
  CC       btl_tcp_endpoint.lo
  CC       btl_tcp_frag.lo
  CC       btl_tcp_proc.lo
  CC       btl_tcp_ft.lo
  CCLD     mca_btl_tcp.la
make[3]: Entering directory '/scratch/build/opal/mca/btl/tcp'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/opal/mca/btl/tcp/help-mpi-btl-tcp.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_btl_tcp.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_btl_tcp.la'
libtool: install: (cd /scratch/build/opal/mca/btl/tcp; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_btl_tcp.la -rpath /opt/openmpi/lib/openmpi btl_tcp.lo btl_tcp_component.lo btl_tcp_endpoint.lo btl_tcp_frag.lo btl_tcp_proc.lo btl_tcp_ft.lo ../../../../opal/libopen-pal.la /scratch/build/opal/mca/common/cuda/libmca_common_cuda.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_btl_tcp.soT /opt/openmpi/lib/openmpi/mca_btl_tcp.so
libtool: install: /usr/bin/install -c .libs/mca_btl_tcp.lai /opt/openmpi/lib/openmpi/mca_btl_tcp.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/opal/mca/btl/tcp'
make[2]: Leaving directory '/scratch/build/opal/mca/btl/tcp'
Making install in mca/btl/vader
make[2]: Entering directory '/scratch/build/opal/mca/btl/vader'
  CC       btl_vader_module.lo
  CC       btl_vader_component.lo
  CC       btl_vader_frag.lo
  CC       btl_vader_send.lo
  CC       btl_vader_sendi.lo
  CC       btl_vader_get.lo
  CC       btl_vader_put.lo
  CC       btl_vader_xpmem.lo
  CC       btl_vader_knem.lo
  CC       btl_vader_sc_emu.lo
  CC       btl_vader_atomic.lo
  CCLD     mca_btl_vader.la
make[3]: Entering directory '/scratch/build/opal/mca/btl/vader'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/opal/mca/btl/vader/help-btl-vader.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_btl_vader.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_btl_vader.la'
libtool: install: (cd /scratch/build/opal/mca/btl/vader; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_btl_vader.la -rpath /opt/openmpi/lib/openmpi btl_vader_module.lo btl_vader_component.lo btl_vader_frag.lo btl_vader_send.lo btl_vader_sendi.lo btl_vader_get.lo btl_vader_put.lo btl_vader_xpmem.lo btl_vader_knem.lo btl_vader_sc_emu.lo btl_vader_atomic.lo ../../../../opal/libopen-pal.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_btl_vader.soT /opt/openmpi/lib/openmpi/mca_btl_vader.so
libtool: install: /usr/bin/install -c .libs/mca_btl_vader.lai /opt/openmpi/lib/openmpi/mca_btl_vader.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/opal/mca/btl/vader'
make[2]: Leaving directory '/scratch/build/opal/mca/btl/vader'
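The three byte-transfer-layer components built so far (smcuda, tcp, vader) are installed as dynamically loadable modules under /opt/openmpi/lib/openmpi. Once the full install finishes, one way to confirm they are visible to Open MPI is to query ompi_info, assuming the Open MPI executables end up under /opt/openmpi/bin (that part of the install is not shown in this portion of the log):

$ /opt/openmpi/bin/ompi_info | grep "MCA btl"
# expected to list the available btl components, including smcuda, tcp and vader installed above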
Making install in mca/compress/bzip
make[2]: Entering directory '/scratch/build/opal/mca/compress/bzip'
  CC       compress_bzip_component.lo
  CC       compress_bzip_module.lo
  CCLD     mca_compress_bzip.la
make[3]: Entering directory '/scratch/build/opal/mca/compress/bzip'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_compress_bzip.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_compress_bzip.la'
libtool: install: (cd /scratch/build/opal/mca/compress/bzip; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_compress_bzip.la -rpath /opt/openmpi/lib/openmpi compress_bzip_component.lo compress_bzip_module.lo ../../../../opal/libopen-pal.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_compress_bzip.soT /opt/openmpi/lib/openmpi/mca_compress_bzip.so
libtool: install: /usr/bin/install -c .libs/mca_compress_bzip.lai /opt/openmpi/lib/openmpi/mca_compress_bzip.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/opal/mca/compress/bzip'
make[2]: Leaving directory '/scratch/build/opal/mca/compress/bzip'
Making install in mca/compress/gzip
make[2]: Entering directory '/scratch/build/opal/mca/compress/gzip'
  CC       compress_gzip_component.lo
  CC       compress_gzip_module.lo
  CCLD     mca_compress_gzip.la
make[3]: Entering directory '/scratch/build/opal/mca/compress/gzip'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_compress_gzip.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_compress_gzip.la'
libtool: install: (cd /scratch/build/opal/mca/compress/gzip; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_compress_gzip.la -rpath /opt/openmpi/lib/openmpi compress_gzip_component.lo compress_gzip_module.lo ../../../../opal/libopen-pal.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_compress_gzip.soT /opt/openmpi/lib/openmpi/mca_compress_gzip.so
libtool: install: /usr/bin/install -c .libs/mca_compress_gzip.lai /opt/openmpi/lib/openmpi/mca_compress_gzip.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/opal/mca/compress/gzip'
make[2]: Leaving directory '/scratch/build/opal/mca/compress/gzip'
Making install in mca/crs/none
make[2]: Entering directory '/scratch/build/opal/mca/crs/none'
  CC       crs_none_component.lo
  CC       crs_none_module.lo
  CCLD     mca_crs_none.la
make[3]: Entering directory '/scratch/build/opal/mca/crs/none'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/opal/mca/crs/none/help-opal-crs-none.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_crs_none.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_crs_none.la'
libtool: install: (cd /scratch/build/opal/mca/crs/none; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_crs_none.la -rpath /opt/openmpi/lib/openmpi crs_none_component.lo crs_none_module.lo ../../../../opal/libopen-pal.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_crs_none.soT /opt/openmpi/lib/openmpi/mca_crs_none.so
libtool: install: /usr/bin/install -c .libs/mca_crs_none.lai /opt/openmpi/lib/openmpi/mca_crs_none.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/opal/mca/crs/none'
make[2]: Leaving directory '/scratch/build/opal/mca/crs/none'
Making install in mca/mpool/hugepage
make[2]: Entering directory '/scratch/build/opal/mca/mpool/hugepage'
  CC       mpool_hugepage_module.lo
  CC       mpool_hugepage_component.lo
  CCLD     mca_mpool_hugepage.la
make[3]: Entering directory '/scratch/build/opal/mca/mpool/hugepage'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_mpool_hugepage.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_mpool_hugepage.la'
libtool: install: (cd /scratch/build/opal/mca/mpool/hugepage; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_mpool_hugepage.la -rpath /opt/openmpi/lib/openmpi mpool_hugepage_module.lo mpool_hugepage_component.lo ../../../../opal/libopen-pal.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_mpool_hugepage.soT /opt/openmpi/lib/openmpi/mca_mpool_hugepage.so
libtool: install: /usr/bin/install -c .libs/mca_mpool_hugepage.lai /opt/openmpi/lib/openmpi/mca_mpool_hugepage.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/opal/mca/mpool/hugepage'
make[2]: Leaving directory '/scratch/build/opal/mca/mpool/hugepage'
Making install in mca/patcher/overwrite
make[2]: Entering directory '/scratch/build/opal/mca/patcher/overwrite'
  CC       patcher_overwrite_module.lo
  CC       patcher_overwrite_component.lo
  CCLD     mca_patcher_overwrite.la
make[3]: Entering directory '/scratch/build/opal/mca/patcher/overwrite'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_patcher_overwrite.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_patcher_overwrite.la'
libtool: install: (cd /scratch/build/opal/mca/patcher/overwrite; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_patcher_overwrite.la -rpath /opt/openmpi/lib/openmpi patcher_overwrite_module.lo patcher_overwrite_component.lo ../../../../opal/libopen-pal.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_patcher_overwrite.soT /opt/openmpi/lib/openmpi/mca_patcher_overwrite.so
libtool: install: /usr/bin/install -c .libs/mca_patcher_overwrite.lai /opt/openmpi/lib/openmpi/mca_patcher_overwrite.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/opal/mca/patcher/overwrite'
make[2]: Leaving directory '/scratch/build/opal/mca/patcher/overwrite'
Making install in mca/pmix/isolated
make[2]: Entering directory '/scratch/build/opal/mca/pmix/isolated'
  CC       pmix_isolated_component.lo
  CC       pmix_isolated.lo
  CCLD     mca_pmix_isolated.la
make[3]: Entering directory '/scratch/build/opal/mca/pmix/isolated'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_pmix_isolated.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_pmix_isolated.la'
libtool: install: (cd /scratch/build/opal/mca/pmix/isolated; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_pmix_isolated.la -rpath /opt/openmpi/lib/openmpi pmix_isolated_component.lo pmix_isolated.lo ../../../../opal/libopen-pal.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_pmix_isolated.soT /opt/openmpi/lib/openmpi/mca_pmix_isolated.so
libtool: install: /usr/bin/install -c .libs/mca_pmix_isolated.lai /opt/openmpi/lib/openmpi/mca_pmix_isolated.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/opal/mca/pmix/isolated'
make[2]: Leaving directory '/scratch/build/opal/mca/pmix/isolated'
Making install in mca/pmix/flux
make[2]: Entering directory '/scratch/build/opal/mca/pmix/flux'
  CC       mca_pmix_flux_la-pmix_flux_component.lo
  CC       mca_pmix_flux_la-pmix_flux.lo
  CCLD     mca_pmix_flux.la
Unpacking finished successfully
make[3]: Entering directory '/scratch/build/opal/mca/pmix/flux'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_pmix_flux.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_pmix_flux.la'
libtool: install: (cd /scratch/build/opal/mca/pmix/flux; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_pmix_flux.la -rpath /opt/openmpi/lib/openmpi mca_pmix_flux_la-pmix_flux_component.lo mca_pmix_flux_la-pmix_flux.lo ../../../../opal/libopen-pal.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_pmix_flux.soT /opt/openmpi/lib/openmpi/mca_pmix_flux.so
libtool: install: /usr/bin/install -c .libs/mca_pmix_flux.lai /opt/openmpi/lib/openmpi/mca_pmix_flux.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/opal/mca/pmix/flux'
make[2]: Leaving directory '/scratch/build/opal/mca/pmix/flux'
Making install in mca/pmix/pmix3x
make[2]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x'
Making install in pmix
make[3]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix'
Making install in config
make[4]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/config'
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/config'
make[5]: Nothing to be done for 'install-exec-am'.
make[5]: Nothing to be done for 'install-data-am'.
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/config'
make[4]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/config'
Making install in contrib
make[4]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/contrib'
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/contrib'
make[5]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/pmix'
 /usr/bin/install -c -m 644 ../../../../../../../openmpi/opal/mca/pmix/pmix3x/pmix/contrib/pmix-valgrind.supp '/opt/openmpi/share/pmix'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/contrib'
make[4]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/contrib'
Making install in include
make[4]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/include'
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/include'
make[5]: Nothing to be done for 'install-exec-am'.
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/include'
make[4]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/include'
Making install in src
make[4]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src'
Making install in util/keyval
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/util/keyval'
  CC       keyval_lex.lo
  CCLD     libpmixutilkeyval.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/util/keyval'
make[6]: Nothing to be done for 'install-exec-am'.
make[6]: Nothing to be done for 'install-data-am'.
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/util/keyval'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/util/keyval'
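The "ar: `u' modifier ignored since `D' is the default" lines that start appearing here, and recur for every static convenience library in the PMIx tree below, are warnings rather than errors: the archiver is invoked with the classic cru flags, and a binutils configured with deterministic archives as its default (as on this builder) simply ignores the u update-only modifier. A hedged way to reproduce the warning outside this build, assuming such a binutils:

$ echo 'int f(void) { return 0; }' > f.c && gcc -c f.c
$ ar cru libf.a f.o
ar: `u' modifier ignored since `D' is the default (see `U')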
Making install in mca/base
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/base'
  CC       pmix_mca_base_close.lo
  CC       pmix_mca_base_cmd_line.lo
  CC       pmix_mca_base_component_find.lo
  CC       pmix_mca_base_component_compare.lo
  CC       pmix_mca_base_component_repository.lo
  CC       pmix_mca_base_components_open.lo
  CC       pmix_mca_base_components_close.lo
  CC       pmix_mca_base_components_select.lo
  CC       pmix_mca_base_list.lo
  CC       pmix_mca_base_open.lo
  CC       pmix_mca_base_var.lo
  CC       pmix_mca_base_var_enum.lo
  CC       pmix_mca_base_var_group.lo
  CC       pmix_mca_base_parse_paramfile.lo
  CC       pmix_mca_base_components_register.lo
  CC       pmix_mca_base_framework.lo
  CCLD     libpmix_mca_base.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/base'
make[6]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/pmix'
 /usr/bin/install -c -m 644 ../../../../../../../../../openmpi/opal/mca/pmix/pmix3x/pmix/src/mca/base/help-pmix-mca-base.txt ../../../../../../../../../openmpi/opal/mca/pmix/pmix3x/pmix/src/mca/base/help-pmix-mca-var.txt '/opt/openmpi/share/pmix'
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/base'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/base'
Making install in mca/common
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/common'
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/common'
make[6]: Nothing to be done for 'install-exec-am'.
make[6]: Nothing to be done for 'install-data-am'.
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/common'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/common'
Making install in mca/bfrops
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/bfrops'
  CC       base/bfrop_base_frame.lo
  CC       base/bfrop_base_select.lo
  CC       base/bfrop_base_fns.lo
  CC       base/bfrop_base_copy.lo
  CC       base/bfrop_base_pack.lo
  CC       base/bfrop_base_print.lo
  CC       base/bfrop_base_unpack.lo
  CC       base/bfrop_base_stubs.lo
  CCLD     libmca_bfrops.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/bfrops'
make[6]: Nothing to be done for 'install-exec-am'.
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/bfrops'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/bfrops'
Making install in mca/gds
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/gds'
  CC       base/gds_base_frame.lo
  CC       base/gds_base_select.lo
  CC       base/gds_base_fns.lo
  CCLD     libmca_gds.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/gds'
make[6]: Nothing to be done for 'install-exec-am'.
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/gds'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/gds'
Making install in mca/pdl
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pdl'
  CC       base/pdl_base_open.lo
  CC       base/pdl_base_close.lo
  CC       base/pdl_base_fns.lo
  CC       base/pdl_base_select.lo
  CCLD     libmca_pdl.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pdl'
make[6]: Nothing to be done for 'install-exec-am'.
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pdl'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pdl'
Making install in mca/pif
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pif'
  CC       base/pif_base_components.lo
  CCLD     libmca_pif.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pif'
make[6]: Nothing to be done for 'install-exec-am'.
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pif'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pif'
Making install in mca/pinstalldirs
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pinstalldirs'
  CC       base/pinstalldirs_base_components.lo
  CC       base/pinstalldirs_base_expand.lo
  CCLD     libmca_pinstalldirs.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pinstalldirs'
make[6]: Nothing to be done for 'install-exec-am'.
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pinstalldirs'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pinstalldirs'
Making install in mca/plog
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/plog'
  CC       base/plog_base_frame.lo
  CC       base/plog_base_select.lo
  CC       base/plog_base_stubs.lo
  CCLD     libmca_plog.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/plog'
make[6]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/pmix'
 /usr/bin/install -c -m 644 ../../../../../../../../../openmpi/opal/mca/pmix/pmix3x/pmix/src/mca/plog/base/help-pmix-plog.txt '/opt/openmpi/share/pmix'
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/plog'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/plog'
Making install in mca/pnet
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pnet'
  CC       base/pnet_base_frame.lo
  CC       base/pnet_base_select.lo
  CC       base/pnet_base_fns.lo
  CCLD     libmca_pnet.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pnet'
make[6]: Nothing to be done for 'install-exec-am'.
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pnet'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pnet'
Making install in mca/preg
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/preg'
  CC       base/preg_base_frame.lo
  CC       base/preg_base_stubs.lo
  CC       base/preg_base_select.lo
  CCLD     libmca_preg.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/preg'
make[6]: Nothing to be done for 'install-exec-am'.
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/preg'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/preg'
Making install in mca/psec
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/psec'
  CC       base/psec_base_frame.lo
  CC       base/psec_base_select.lo
  CC       base/psec_base_fns.lo
  CCLD     libmca_psec.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/psec'
make[6]: Nothing to be done for 'install-exec-am'.
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/psec'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/psec'
Making install in mca/psensor
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/psensor'
  CC       base/psensor_base_frame.lo
  CC       base/psensor_base_select.lo
  CC       base/psensor_base_stubs.lo
  CCLD     libmca_psensor.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/psensor'
make[6]: Nothing to be done for 'install-exec-am'.
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/psensor'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/psensor'
Making install in mca/pshmem
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pshmem'
  CC       base/pshmem_base_frame.lo
  CC       base/pshmem_base_select.lo
  CCLD     libmca_pshmem.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pshmem'
make[6]: Nothing to be done for 'install-exec-am'.
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pshmem'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pshmem'
Making install in mca/ptl
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/ptl'
  CC       base/ptl_base_frame.lo
  CC       base/ptl_base_select.lo
  CC       base/ptl_base_sendrecv.lo
  CC       base/ptl_base_listener.lo
  CC       base/ptl_base_stubs.lo
  CC       base/ptl_base_connect.lo
  CCLD     libmca_ptl.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/ptl'
make[6]: Nothing to be done for 'install-exec-am'.
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/ptl'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/ptl'
Making install in mca/pdl/pdlopen
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pdl/pdlopen'
  CC       pdl_pdlopen_component.lo
  CC       pdl_pdlopen_module.lo
  CCLD     libmca_pdl_pdlopen.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pdl/pdlopen'
make[6]: Nothing to be done for 'install-exec-am'.
make[6]: Nothing to be done for 'install-data-am'.
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pdl/pdlopen'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pdl/pdlopen'
Making install in mca/pif/linux_ipv6
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pif/linux_ipv6'
  CC       pif_linux_ipv6.lo
  CCLD     libmca_pif_linux_ipv6.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pif/linux_ipv6'
make[6]: Nothing to be done for 'install-exec-am'.
make[6]: Nothing to be done for 'install-data-am'.
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pif/linux_ipv6'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pif/linux_ipv6'
Making install in mca/pif/posix_ipv4
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pif/posix_ipv4'
  CC       pif_posix.lo
  CCLD     libmca_pif_posix_ipv4.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pif/posix_ipv4'
make[6]: Nothing to be done for 'install-exec-am'.
make[6]: Nothing to be done for 'install-data-am'.
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pif/posix_ipv4'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pif/posix_ipv4'
Making install in mca/pinstalldirs/env
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pinstalldirs/env'
  CC       pmix_pinstalldirs_env.lo
  CCLD     libmca_pinstalldirs_env.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pinstalldirs/env'
make[6]: Nothing to be done for 'install-exec-am'.
make[6]: Nothing to be done for 'install-data-am'.
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pinstalldirs/env'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pinstalldirs/env'
Making install in mca/pinstalldirs/config
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pinstalldirs/config'
  CC       pmix_pinstalldirs_config.lo
  CCLD     libmca_pinstalldirs_config.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pinstalldirs/config'
make[6]: Nothing to be done for 'install-exec-am'.
make[6]: Nothing to be done for 'install-data-am'.
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pinstalldirs/config'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pinstalldirs/config'
Making install in .
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src'
  CC       threads/mutex.lo
  CC       threads/thread.lo
  CC       threads/wait_sync.lo
  CC       class/pmix_bitmap.lo
  CC       class/pmix_object.lo
  CC       class/pmix_list.lo
  CC       class/pmix_pointer_array.lo
  CC       class/pmix_hash_table.lo
  CC       class/pmix_hotel.lo
  CC       class/pmix_ring_buffer.lo
  CC       class/pmix_value_array.lo
  CC       event/pmix_event_notification.lo
  CC       event/pmix_event_registration.lo
  CC       include/pmix_globals.lo
  CC       util/alfg.lo
  CC       util/argv.lo
  CC       util/cmd_line.lo
  CC       util/error.lo
  CC       util/printf.lo
  CC       util/output.lo
  CC       util/pmix_environ.lo
  CC       util/crc.lo
  CC       util/fd.lo
  CC       util/timings.lo
  CC       util/os_path.lo
  CC       util/basename.lo
  CC       util/keyval_parse.lo
  CC       util/show_help.lo
  CC       util/show_help_lex.lo
  CC       util/path.lo
  CC       util/getid.lo
  CC       util/hash.lo
  CC       util/name_fns.lo
  CC       util/net.lo
  CC       util/pif.lo
  CC       util/parse_options.lo
  CC       util/compress.lo
  CC       client/pmix_client.lo
  CC       client/pmix_client_fence.lo
  CC       client/pmix_client_get.lo
  CC       client/pmix_client_pub.lo
  CC       client/pmix_client_spawn.lo
  CC       client/pmix_client_connect.lo
  CC       server/pmix_server.lo
  CC       server/pmix_server_ops.lo
  CC       server/pmix_server_get.lo
  CC       runtime/pmix_finalize.lo
  CC       runtime/pmix_init.lo
  CC       runtime/pmix_params.lo
  CC       runtime/pmix_progress_threads.lo
  CC       tool/pmix_tool.lo
  CC       common/pmix_query.lo
  CC       common/pmix_strings.lo
  CC       common/pmix_log.lo
  CC       common/pmix_control.lo
  CC       common/pmix_data.lo
  CC       common/pmix_security.lo
  CC       common/pmix_iof.lo
  CC       hwloc/hwloc.lo
  CCLD     libpmix.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src'
 /usr/bin/mkdir -p '/opt/openmpi/share/pmix'
 /usr/bin/install -c -m 644 ../../../../../../../openmpi/opal/mca/pmix/pmix3x/pmix/src/server/help-pmix-server.txt ../../../../../../../openmpi/opal/mca/pmix/pmix3x/pmix/src/runtime/help-pmix-runtime.txt '/opt/openmpi/share/pmix'
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src'
Making install in mca/common/dstore
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/common/dstore'
if test -z "libmca_common_dstore.la"; then \
	rm -f "libmca_common_dstore.la"; \
	ln -s "libmca_common_dstore_noinst.la" "libmca_common_dstore.la"; \
fi
  CC       dstore_base.lo
  CC       dstore_segment.lo
  CCLD     libmca_common_dstore.la
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/common/dstore'
 /usr/bin/mkdir -p '/opt/openmpi/lib'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   libmca_common_dstore.la '/opt/openmpi/lib'
libtool: install: /usr/bin/install -c .libs/libmca_common_dstore.so.1.0.1 /opt/openmpi/lib/libmca_common_dstore.so.1.0.1
libtool: install: (cd /opt/openmpi/lib && { ln -s -f libmca_common_dstore.so.1.0.1 libmca_common_dstore.so.1 || { rm -f libmca_common_dstore.so.1 && ln -s libmca_common_dstore.so.1.0.1 libmca_common_dstore.so.1; }; })
libtool: install: (cd /opt/openmpi/lib && { ln -s -f libmca_common_dstore.so.1.0.1 libmca_common_dstore.so || { rm -f libmca_common_dstore.so && ln -s libmca_common_dstore.so.1.0.1 libmca_common_dstore.so; }; })
libtool: install: /usr/bin/install -c .libs/libmca_common_dstore.lai /opt/openmpi/lib/libmca_common_dstore.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/common/dstore'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/common/dstore'
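Unlike the plugin modules above, libmca_common_dstore is installed into /opt/openmpi/lib as a versioned shared library: libtool copies the real file (.so.1.0.1), creates the usual symlink chain (.so.1 and .so both pointing at .so.1.0.1), and runs ldconfig -n on the directory. A quick, hypothetical way to check which soname consumers of the library will request:

$ readelf -d /opt/openmpi/lib/libmca_common_dstore.so.1.0.1 | grep SONAME
# the dynamic loader then resolves that soname through the libmca_common_dstore.so.1 symlink created above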
Making install in mca/bfrops/v12
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/bfrops/v12'
  CC       bfrop_v12_component.lo
  CC       bfrop_v12.lo
  CC       pack.lo
  CC       unpack.lo
  CC       copy.lo
  CC       print.lo
  CCLD     mca_bfrops_v12.la
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/bfrops/v12'
make[6]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/pmix'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_bfrops_v12.la '/opt/openmpi/lib/pmix'
libtool: install: /usr/bin/install -c .libs/mca_bfrops_v12.so /opt/openmpi/lib/pmix/mca_bfrops_v12.so
libtool: install: /usr/bin/install -c .libs/mca_bfrops_v12.lai /opt/openmpi/lib/pmix/mca_bfrops_v12.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/pmix
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib/pmix

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/bfrops/v12'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/bfrops/v12'
Making install in mca/bfrops/v20
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/bfrops/v20'
  CC       bfrop_pmix20_component.lo
  CC       bfrop_pmix20.lo
  CC       copy.lo
  CC       pack.lo
  CC       print.lo
  CC       unpack.lo
  CCLD     mca_bfrops_v20.la
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/bfrops/v20'
make[6]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/pmix'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_bfrops_v20.la '/opt/openmpi/lib/pmix'
libtool: install: /usr/bin/install -c .libs/mca_bfrops_v20.so /opt/openmpi/lib/pmix/mca_bfrops_v20.so
libtool: install: /usr/bin/install -c .libs/mca_bfrops_v20.lai /opt/openmpi/lib/pmix/mca_bfrops_v20.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/pmix
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/bfrops/v20'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/bfrops/v20'
Making install in mca/bfrops/v21
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/bfrops/v21'
  CC       bfrop_pmix21_component.lo
  CC       bfrop_pmix21.lo
  CCLD     mca_bfrops_v21.la
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/bfrops/v21'
make[6]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/pmix'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_bfrops_v21.la '/opt/openmpi/lib/pmix'
libtool: install: /usr/bin/install -c .libs/mca_bfrops_v21.so /opt/openmpi/lib/pmix/mca_bfrops_v21.so
libtool: install: /usr/bin/install -c .libs/mca_bfrops_v21.lai /opt/openmpi/lib/pmix/mca_bfrops_v21.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/pmix
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/bfrops/v21'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/bfrops/v21'
Making install in mca/bfrops/v3
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/bfrops/v3'
  CC       bfrop_pmix3_component.lo
  CC       bfrop_pmix3.lo
  CCLD     mca_bfrops_v3.la
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/bfrops/v3'
make[6]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/pmix'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_bfrops_v3.la '/opt/openmpi/lib/pmix'
libtool: install: /usr/bin/install -c .libs/mca_bfrops_v3.so /opt/openmpi/lib/pmix/mca_bfrops_v3.so
libtool: install: /usr/bin/install -c .libs/mca_bfrops_v3.lai /opt/openmpi/lib/pmix/mca_bfrops_v3.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/pmix
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/bfrops/v3'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/bfrops/v3'
Making install in mca/gds/ds12
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/gds/ds12'
  CC       gds_ds12_base.lo
  CC       gds_ds12_component.lo
  CC       gds_ds12_file.lo
  CC       gds_ds12_lock.lo
  CC       gds_ds20_file.lo
  CC       gds_ds12_lock_pthread.lo
  CCLD     mca_gds_ds12.la
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/gds/ds12'
make[6]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/pmix'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_gds_ds12.la '/opt/openmpi/lib/pmix'
libtool: warning: relinking 'mca_gds_ds12.la'
libtool: install: (cd /scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/gds/ds12; /bin/bash "/scratch/build/opal/mca/pmix/pmix3x/pmix/libtool"  --silent --tag CC --mode=relink gcc -DNDEBUG -O3 -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version /scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/common/dstore/libmca_common_dstore.la -o mca_gds_ds12.la -rpath /opt/openmpi/lib/pmix gds_ds12_base.lo gds_ds12_lock.lo gds_ds12_component.lo gds_ds12_file.lo gds_ds20_file.lo gds_ds12_lock_pthread.lo -lm -lutil -ldl )
libtool: install: /usr/bin/install -c .libs/mca_gds_ds12.soT /opt/openmpi/lib/pmix/mca_gds_ds12.so
libtool: install: /usr/bin/install -c .libs/mca_gds_ds12.lai /opt/openmpi/lib/pmix/mca_gds_ds12.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/pmix
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/gds/ds12'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/gds/ds12'
Making install in mca/gds/ds21
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/gds/ds21'
  CC       gds_ds21_base.lo
  CC       gds_ds21_lock.lo
  CC       gds_ds21_lock_pthread.lo
  CC       gds_ds21_component.lo
  CC       gds_ds21_file.lo
  CCLD     mca_gds_ds21.la
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/gds/ds21'
make[6]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/pmix'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_gds_ds21.la '/opt/openmpi/lib/pmix'
libtool: warning: relinking 'mca_gds_ds21.la'
libtool: install: (cd /scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/gds/ds21; /bin/bash "/scratch/build/opal/mca/pmix/pmix3x/pmix/libtool"  --silent --tag CC --mode=relink gcc -DNDEBUG -O3 -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version /scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/common/dstore/libmca_common_dstore.la -o mca_gds_ds21.la -rpath /opt/openmpi/lib/pmix gds_ds21_base.lo gds_ds21_lock.lo gds_ds21_lock_pthread.lo gds_ds21_component.lo gds_ds21_file.lo -lm -lutil -ldl )
libtool: install: /usr/bin/install -c .libs/mca_gds_ds21.soT /opt/openmpi/lib/pmix/mca_gds_ds21.so
libtool: install: /usr/bin/install -c .libs/mca_gds_ds21.lai /opt/openmpi/lib/pmix/mca_gds_ds21.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/pmix
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/gds/ds21'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/gds/ds21'
Making install in mca/gds/hash
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/gds/hash'
  CC       gds_hash_component.lo
  CC       gds_hash.lo
  CCLD     mca_gds_hash.la
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/gds/hash'
make[6]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/pmix'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_gds_hash.la '/opt/openmpi/lib/pmix'
libtool: install: /usr/bin/install -c .libs/mca_gds_hash.so /opt/openmpi/lib/pmix/mca_gds_hash.so
libtool: install: /usr/bin/install -c .libs/mca_gds_hash.lai /opt/openmpi/lib/pmix/mca_gds_hash.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/pmix
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/gds/hash'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/gds/hash'
Making install in mca/plog/default
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/plog/default'
  CC       plog_default.lo
  CC       plog_default_component.lo
  CCLD     mca_plog_default.la
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/plog/default'
make[6]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/pmix'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_plog_default.la '/opt/openmpi/lib/pmix'
libtool: install: /usr/bin/install -c .libs/mca_plog_default.so /opt/openmpi/lib/pmix/mca_plog_default.so
libtool: install: /usr/bin/install -c .libs/mca_plog_default.lai /opt/openmpi/lib/pmix/mca_plog_default.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/pmix
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/plog/default'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/plog/default'
Making install in mca/plog/stdfd
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/plog/stdfd'
  CC       plog_stdfd.lo
  CC       plog_stdfd_component.lo
 ---> Removed intermediate container 66023e29de7c
 ---> f952b5dbf850
Step 10/18 : ENV PATH=${CMAKE_DIR}/bin:$PATH
 ---> Running in 4c2f23eaa633
  CCLD     mca_plog_stdfd.la
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/plog/stdfd'
make[6]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/pmix'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_plog_stdfd.la '/opt/openmpi/lib/pmix'
libtool: install: /usr/bin/install -c .libs/mca_plog_stdfd.so /opt/openmpi/lib/pmix/mca_plog_stdfd.so
libtool: install: /usr/bin/install -c .libs/mca_plog_stdfd.lai /opt/openmpi/lib/pmix/mca_plog_stdfd.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/pmix
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/plog/stdfd'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/plog/stdfd'
Making install in mca/plog/syslog
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/plog/syslog'
  CC       plog_syslog.lo
  CC       plog_syslog_component.lo
  CCLD     mca_plog_syslog.la
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/plog/syslog'
make[6]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/pmix'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_plog_syslog.la '/opt/openmpi/lib/pmix'
libtool: install: /usr/bin/install -c .libs/mca_plog_syslog.so /opt/openmpi/lib/pmix/mca_plog_syslog.so
libtool: install: /usr/bin/install -c .libs/mca_plog_syslog.lai /opt/openmpi/lib/pmix/mca_plog_syslog.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/pmix
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/plog/syslog'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/plog/syslog'
Making install in mca/pnet/tcp
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pnet/tcp'
  CC       pnet_tcp_component.lo
  CC       pnet_tcp.lo
  CCLD     mca_pnet_tcp.la
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pnet/tcp'
make[6]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/pmix'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_pnet_tcp.la '/opt/openmpi/lib/pmix'
libtool: install: /usr/bin/install -c .libs/mca_pnet_tcp.so /opt/openmpi/lib/pmix/mca_pnet_tcp.so
libtool: install: /usr/bin/install -c .libs/mca_pnet_tcp.lai /opt/openmpi/lib/pmix/mca_pnet_tcp.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/pmix
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pnet/tcp'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pnet/tcp'
Making install in mca/pnet/test
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pnet/test'
  CC       pnet_test_component.lo
  CC       pnet_test.lo
  CCLD     mca_pnet_test.la
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pnet/test'
make[6]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/pmix'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_pnet_test.la '/opt/openmpi/lib/pmix'
libtool: install: /usr/bin/install -c .libs/mca_pnet_test.so /opt/openmpi/lib/pmix/mca_pnet_test.so
libtool: install: /usr/bin/install -c .libs/mca_pnet_test.lai /opt/openmpi/lib/pmix/mca_pnet_test.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/pmix
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pnet/test'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pnet/test'
Making install in mca/preg/native
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/preg/native'
  CC       preg_native_component.lo
  CC       preg_native.lo
  CCLD     mca_preg_native.la
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/preg/native'
make[6]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/pmix'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_preg_native.la '/opt/openmpi/lib/pmix'
libtool: install: /usr/bin/install -c .libs/mca_preg_native.so /opt/openmpi/lib/pmix/mca_preg_native.so
libtool: install: /usr/bin/install -c .libs/mca_preg_native.lai /opt/openmpi/lib/pmix/mca_preg_native.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/pmix
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/preg/native'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/preg/native'
Making install in mca/psec/native
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/psec/native'
  CC       psec_native_component.lo
  CC       psec_native.lo
  CCLD     mca_psec_native.la
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/psec/native'
make[6]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/pmix'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_psec_native.la '/opt/openmpi/lib/pmix'
libtool: install: /usr/bin/install -c .libs/mca_psec_native.so /opt/openmpi/lib/pmix/mca_psec_native.so
libtool: install: /usr/bin/install -c .libs/mca_psec_native.lai /opt/openmpi/lib/pmix/mca_psec_native.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/pmix
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/psec/native'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/psec/native'
Making install in mca/psec/none
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/psec/none'
  CC       psec_none_component.lo
  CC       psec_none.lo
  CCLD     mca_psec_none.la
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/psec/none'
make[6]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/pmix'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_psec_none.la '/opt/openmpi/lib/pmix'
libtool: install: /usr/bin/install -c .libs/mca_psec_none.so /opt/openmpi/lib/pmix/mca_psec_none.so
libtool: install: /usr/bin/install -c .libs/mca_psec_none.lai /opt/openmpi/lib/pmix/mca_psec_none.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/pmix
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/psec/none'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/psec/none'
Making install in mca/psensor/file
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/psensor/file'
  CC       psensor_file.lo
  CC       psensor_file_component.lo
  CCLD     mca_psensor_file.la
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/psensor/file'
make[6]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/pmix'
 /usr/bin/install -c -m 644 ../../../../../../../../../../openmpi/opal/mca/pmix/pmix3x/pmix/src/mca/psensor/file/help-pmix-psensor-file.txt '/opt/openmpi/share/pmix'
 /usr/bin/mkdir -p '/opt/openmpi/lib/pmix'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_psensor_file.la '/opt/openmpi/lib/pmix'
libtool: install: /usr/bin/install -c .libs/mca_psensor_file.so /opt/openmpi/lib/pmix/mca_psensor_file.so
libtool: install: /usr/bin/install -c .libs/mca_psensor_file.lai /opt/openmpi/lib/pmix/mca_psensor_file.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/pmix
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/psensor/file'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/psensor/file'
Making install in mca/psensor/heartbeat
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/psensor/heartbeat'
  CC       psensor_heartbeat.lo
  CC       psensor_heartbeat_component.lo
  CCLD     mca_psensor_heartbeat.la
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/psensor/heartbeat'
make[6]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/pmix'
 /usr/bin/install -c -m 644 ../../../../../../../../../../openmpi/opal/mca/pmix/pmix3x/pmix/src/mca/psensor/heartbeat/help-pmix-psensor-heartbeat.txt '/opt/openmpi/share/pmix'
 /usr/bin/mkdir -p '/opt/openmpi/lib/pmix'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_psensor_heartbeat.la '/opt/openmpi/lib/pmix'
libtool: install: /usr/bin/install -c .libs/mca_psensor_heartbeat.so /opt/openmpi/lib/pmix/mca_psensor_heartbeat.so
libtool: install: /usr/bin/install -c .libs/mca_psensor_heartbeat.lai /opt/openmpi/lib/pmix/mca_psensor_heartbeat.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/pmix
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/psensor/heartbeat'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/psensor/heartbeat'
Making install in mca/pshmem/mmap
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pshmem/mmap'
  CC       pshmem_mmap.lo
  CC       pshmem_mmap_component.lo
  CCLD     mca_pshmem_mmap.la
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pshmem/mmap'
make[6]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/pmix'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_pshmem_mmap.la '/opt/openmpi/lib/pmix'
libtool: install: /usr/bin/install -c .libs/mca_pshmem_mmap.so /opt/openmpi/lib/pmix/mca_pshmem_mmap.so
libtool: install: /usr/bin/install -c .libs/mca_pshmem_mmap.lai /opt/openmpi/lib/pmix/mca_pshmem_mmap.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/pmix
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pshmem/mmap'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/pshmem/mmap'
Making install in mca/ptl/tcp
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/ptl/tcp'
  CC       ptl_tcp_component.lo
  CC       ptl_tcp.lo
  CCLD     mca_ptl_tcp.la
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/ptl/tcp'
make[6]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/pmix'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_ptl_tcp.la '/opt/openmpi/lib/pmix'
libtool: install: /usr/bin/install -c .libs/mca_ptl_tcp.so /opt/openmpi/lib/pmix/mca_ptl_tcp.so
libtool: install: /usr/bin/install -c .libs/mca_ptl_tcp.lai /opt/openmpi/lib/pmix/mca_ptl_tcp.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/pmix
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/ptl/tcp'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/ptl/tcp'
Making install in mca/ptl/usock
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/ptl/usock'
  CC       ptl_usock_component.lo
  CC       ptl_usock.lo
  CCLD     mca_ptl_usock.la
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/ptl/usock'
make[6]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/pmix'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_ptl_usock.la '/opt/openmpi/lib/pmix'
libtool: install: /usr/bin/install -c .libs/mca_ptl_usock.so /opt/openmpi/lib/pmix/mca_ptl_usock.so
libtool: install: /usr/bin/install -c .libs/mca_ptl_usock.lai /opt/openmpi/lib/pmix/mca_ptl_usock.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/pmix
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/ptl/usock'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/mca/ptl/usock'
Making install in tools/pevent
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/tools/pevent'
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/tools/pevent'
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/tools/pevent'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/tools/pevent'
Making install in tools/pmix_info
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/tools/pmix_info'
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/tools/pmix_info'
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/tools/pmix_info'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/tools/pmix_info'
Making install in tools/plookup
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/tools/plookup'
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/tools/plookup'
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/tools/plookup'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/tools/plookup'
Making install in tools/pps
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/tools/pps'
make[6]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/tools/pps'
make[6]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/tools/pps'
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src/tools/pps'
make[4]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/src'
Making install in etc
make[4]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/etc'
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/etc'
make[5]: Nothing to be done for 'install-exec-am'.
/usr/bin/mkdir -p /opt/openmpi/etc
 /usr/bin/install -c -m 644 ../../../../../../../openmpi/opal/mca/pmix/pmix3x/pmix/etc/pmix-mca-params.conf /opt/openmpi/etc/pmix-mca-params.conf
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/etc'
make[4]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix/etc'
make[4]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix'
make[5]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x/pmix'
make[5]: Nothing to be done for 'install-exec-am'.
make[5]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix'
make[4]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix'
make[3]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x/pmix'
make[3]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x'
  CC       mca_pmix_pmix3x_la-pmix3x_component.lo
  CC       mca_pmix_pmix3x_la-pmix3x_client.lo
  CC       mca_pmix_pmix3x_la-pmix3x.lo
  CC       mca_pmix_pmix3x_la-pmix3x_local.lo
  CC       mca_pmix_pmix3x_la-pmix3x_server_south.lo
  CC       mca_pmix_pmix3x_la-pmix3x_server_north.lo
  CCLD     mca_pmix_pmix3x.la
make[4]: Entering directory '/scratch/build/opal/mca/pmix/pmix3x'
make[4]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/opal/mca/pmix/pmix3x/help-pmix-pmix3x.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_pmix_pmix3x.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_pmix_pmix3x.la'
libtool: install: (cd /scratch/build/opal/mca/pmix/pmix3x; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_pmix_pmix3x.la -rpath /opt/openmpi/lib/openmpi mca_pmix_pmix3x_la-pmix3x_component.lo mca_pmix_pmix3x_la-pmix3x.lo mca_pmix_pmix3x_la-pmix3x_client.lo mca_pmix_pmix3x_la-pmix3x_local.lo mca_pmix_pmix3x_la-pmix3x_server_south.lo mca_pmix_pmix3x_la-pmix3x_server_north.lo ../../../../opal/libopen-pal.la /scratch/build/opal/mca/pmix/pmix3x/pmix/src/libpmix.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_pmix_pmix3x.soT /opt/openmpi/lib/openmpi/mca_pmix_pmix3x.so
libtool: install: /usr/bin/install -c .libs/mca_pmix_pmix3x.lai /opt/openmpi/lib/openmpi/mca_pmix_pmix3x.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib/openmpi

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
make[4]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x'
make[3]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x'
make[2]: Leaving directory '/scratch/build/opal/mca/pmix/pmix3x'
Making install in mca/pstat/linux
make[2]: Entering directory '/scratch/build/opal/mca/pstat/linux'
  CC       pstat_linux_component.lo
  CC       pstat_linux_module.lo
  CCLD     mca_pstat_linux.la
make[3]: Entering directory '/scratch/build/opal/mca/pstat/linux'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_pstat_linux.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_pstat_linux.la'
libtool: install: (cd /scratch/build/opal/mca/pstat/linux; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_pstat_linux.la -rpath /opt/openmpi/lib/openmpi pstat_linux_component.lo pstat_linux_module.lo ../../../../opal/libopen-pal.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_pstat_linux.soT /opt/openmpi/lib/openmpi/mca_pstat_linux.so
libtool: install: /usr/bin/install -c .libs/mca_pstat_linux.lai /opt/openmpi/lib/openmpi/mca_pstat_linux.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/opal/mca/pstat/linux'
make[2]: Leaving directory '/scratch/build/opal/mca/pstat/linux'
Making install in mca/rcache/grdma
make[2]: Entering directory '/scratch/build/opal/mca/rcache/grdma'
  CC       rcache_grdma_module.lo
  CC       rcache_grdma_component.lo
  CCLD     mca_rcache_grdma.la
make[3]: Entering directory '/scratch/build/opal/mca/rcache/grdma'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_rcache_grdma.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_rcache_grdma.la'
libtool: install: (cd /scratch/build/opal/mca/rcache/grdma; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_rcache_grdma.la -rpath /opt/openmpi/lib/openmpi rcache_grdma_module.lo rcache_grdma_component.lo ../../../../opal/libopen-pal.la /scratch/build/opal/mca/common/cuda/libmca_common_cuda.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_rcache_grdma.soT /opt/openmpi/lib/openmpi/mca_rcache_grdma.so
libtool: install: /usr/bin/install -c .libs/mca_rcache_grdma.lai /opt/openmpi/lib/openmpi/mca_rcache_grdma.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/opal/mca/rcache/grdma'
make[2]: Leaving directory '/scratch/build/opal/mca/rcache/grdma'
Making install in mca/rcache/gpusm
make[2]: Entering directory '/scratch/build/opal/mca/rcache/gpusm'
  CC       rcache_gpusm_module.lo
  CC       rcache_gpusm_component.lo
  CCLD     mca_rcache_gpusm.la
make[3]: Entering directory '/scratch/build/opal/mca/rcache/gpusm'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_rcache_gpusm.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_rcache_gpusm.la'
libtool: install: (cd /scratch/build/opal/mca/rcache/gpusm; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_rcache_gpusm.la -rpath /opt/openmpi/lib/openmpi rcache_gpusm_module.lo rcache_gpusm_component.lo ../../../../opal/libopen-pal.la /scratch/build/opal/mca/common/cuda/libmca_common_cuda.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_rcache_gpusm.soT /opt/openmpi/lib/openmpi/mca_rcache_gpusm.so
libtool: install: /usr/bin/install -c .libs/mca_rcache_gpusm.lai /opt/openmpi/lib/openmpi/mca_rcache_gpusm.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/opal/mca/rcache/gpusm'
make[2]: Leaving directory '/scratch/build/opal/mca/rcache/gpusm'
Making install in mca/rcache/rgpusm
make[2]: Entering directory '/scratch/build/opal/mca/rcache/rgpusm'
  CC       rcache_rgpusm_module.lo
  CC       rcache_rgpusm_component.lo
  CCLD     mca_rcache_rgpusm.la
make[3]: Entering directory '/scratch/build/opal/mca/rcache/rgpusm'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_rcache_rgpusm.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_rcache_rgpusm.la'
libtool: install: (cd /scratch/build/opal/mca/rcache/rgpusm; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_rcache_rgpusm.la -rpath /opt/openmpi/lib/openmpi rcache_rgpusm_module.lo rcache_rgpusm_component.lo ../../../../opal/libopen-pal.la /scratch/build/opal/mca/common/cuda/libmca_common_cuda.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_rcache_rgpusm.soT /opt/openmpi/lib/openmpi/mca_rcache_rgpusm.so
libtool: install: /usr/bin/install -c .libs/mca_rcache_rgpusm.lai /opt/openmpi/lib/openmpi/mca_rcache_rgpusm.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/opal/mca/rcache/rgpusm'
make[2]: Leaving directory '/scratch/build/opal/mca/rcache/rgpusm'
Making install in mca/reachable/weighted
make[2]: Entering directory '/scratch/build/opal/mca/reachable/weighted'
  CC       reachable_weighted_component.lo
  CC       reachable_weighted.lo
  CCLD     mca_reachable_weighted.la
make[3]: Entering directory '/scratch/build/opal/mca/reachable/weighted'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_reachable_weighted.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_reachable_weighted.la'
libtool: install: (cd /scratch/build/opal/mca/reachable/weighted; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_reachable_weighted.la -rpath /opt/openmpi/lib/openmpi reachable_weighted_component.lo reachable_weighted.lo ../../../../opal/libopen-pal.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_reachable_weighted.soT /opt/openmpi/lib/openmpi/mca_reachable_weighted.so
libtool: install: /usr/bin/install -c .libs/mca_reachable_weighted.lai /opt/openmpi/lib/openmpi/mca_reachable_weighted.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/opal/mca/reachable/weighted'
make[2]: Leaving directory '/scratch/build/opal/mca/reachable/weighted'
Making install in mca/shmem/mmap
make[2]: Entering directory '/scratch/build/opal/mca/shmem/mmap'
  CC       shmem_mmap_component.lo
  CC       shmem_mmap_module.lo
  CCLD     mca_shmem_mmap.la
make[3]: Entering directory '/scratch/build/opal/mca/shmem/mmap'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/opal/mca/shmem/mmap/help-opal-shmem-mmap.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_shmem_mmap.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_shmem_mmap.la'
libtool: install: (cd /scratch/build/opal/mca/shmem/mmap; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_shmem_mmap.la -rpath /opt/openmpi/lib/openmpi shmem_mmap_component.lo shmem_mmap_module.lo ../../../../opal/libopen-pal.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_shmem_mmap.soT /opt/openmpi/lib/openmpi/mca_shmem_mmap.so
libtool: install: /usr/bin/install -c .libs/mca_shmem_mmap.lai /opt/openmpi/lib/openmpi/mca_shmem_mmap.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/opal/mca/shmem/mmap'
make[2]: Leaving directory '/scratch/build/opal/mca/shmem/mmap'
Making install in mca/shmem/posix
make[2]: Entering directory '/scratch/build/opal/mca/shmem/posix'
  CC       shmem_posix_common_utils.lo
  CC       shmem_posix_component.lo
  CC       shmem_posix_module.lo
  CCLD     mca_shmem_posix.la
make[3]: Entering directory '/scratch/build/opal/mca/shmem/posix'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/opal/mca/shmem/posix/help-opal-shmem-posix.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_shmem_posix.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_shmem_posix.la'
libtool: install: (cd /scratch/build/opal/mca/shmem/posix; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_shmem_posix.la -rpath /opt/openmpi/lib/openmpi shmem_posix_common_utils.lo shmem_posix_component.lo shmem_posix_module.lo ../../../../opal/libopen-pal.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_shmem_posix.soT /opt/openmpi/lib/openmpi/mca_shmem_posix.so
libtool: install: /usr/bin/install -c .libs/mca_shmem_posix.lai /opt/openmpi/lib/openmpi/mca_shmem_posix.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/opal/mca/shmem/posix'
make[2]: Leaving directory '/scratch/build/opal/mca/shmem/posix'
Making install in mca/shmem/sysv
make[2]: Entering directory '/scratch/build/opal/mca/shmem/sysv'
  CC       shmem_sysv_module.lo
  CC       shmem_sysv_component.lo
  CCLD     mca_shmem_sysv.la
make[3]: Entering directory '/scratch/build/opal/mca/shmem/sysv'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/opal/mca/shmem/sysv/help-opal-shmem-sysv.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_shmem_sysv.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_shmem_sysv.la'
libtool: install: (cd /scratch/build/opal/mca/shmem/sysv; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_shmem_sysv.la -rpath /opt/openmpi/lib/openmpi shmem_sysv_component.lo shmem_sysv_module.lo ../../../../opal/libopen-pal.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_shmem_sysv.soT /opt/openmpi/lib/openmpi/mca_shmem_sysv.so
libtool: install: /usr/bin/install -c .libs/mca_shmem_sysv.lai /opt/openmpi/lib/openmpi/mca_shmem_sysv.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/opal/mca/shmem/sysv'
make[2]: Leaving directory '/scratch/build/opal/mca/shmem/sysv'
Making install in tools/wrappers
make[2]: Entering directory '/scratch/build/opal/tools/wrappers'
  CC       opal_wrapper.o
  GENERATE opal_wrapper.1
  CCLD     opal_wrapper
make[3]: Entering directory '/scratch/build/opal/tools/wrappers'
 /usr/bin/mkdir -p '/opt/openmpi/bin'
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/share/man/man1'
 /usr/bin/install -c -m 644 ../../../../openmpi/opal/tools/wrappers/help-opal-wrapper.txt '/opt/openmpi/share/openmpi'
  /bin/bash ../../../libtool   --mode=install /usr/bin/install -c opal_wrapper '/opt/openmpi/bin'
 /usr/bin/install -c -m 644 opal_wrapper.1 '/opt/openmpi/share/man/man1'
libtool: install: /usr/bin/install -c .libs/opal_wrapper /opt/openmpi/bin/opal_wrapper
make  install-exec-hook
make[4]: Entering directory '/scratch/build/opal/tools/wrappers'
make[4]: Nothing to be done for 'install-exec-hook'.
make[4]: Leaving directory '/scratch/build/opal/tools/wrappers'
make[3]: Leaving directory '/scratch/build/opal/tools/wrappers'
make[2]: Leaving directory '/scratch/build/opal/tools/wrappers'
make[1]: Leaving directory '/scratch/build/opal'
Making install in orte
make[1]: Entering directory '/scratch/build/orte'
Making install in include
make[2]: Entering directory '/scratch/build/orte/include'
make[3]: Entering directory '/scratch/build/orte/include'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/orte/include'
make[2]: Leaving directory '/scratch/build/orte/include'
Making install in mca/common
make[2]: Entering directory '/scratch/build/orte/mca/common'
make[3]: Entering directory '/scratch/build/orte/mca/common'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/scratch/build/orte/mca/common'
make[2]: Leaving directory '/scratch/build/orte/mca/common'
Making install in mca/errmgr
make[2]: Entering directory '/scratch/build/orte/mca/errmgr'
  CC       base/errmgr_base_select.lo
  CC       base/errmgr_base_frame.lo
  CC       base/errmgr_base_fns.lo
  CCLD     libmca_errmgr.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/orte/mca/errmgr'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../openmpi/orte/mca/errmgr/base/help-errmgr-base.txt '/opt/openmpi/share/openmpi'
make[3]: Leaving directory '/scratch/build/orte/mca/errmgr'
make[2]: Leaving directory '/scratch/build/orte/mca/errmgr'
Making install in mca/ess
make[2]: Entering directory '/scratch/build/orte/mca/ess'
  CC       base/ess_base_frame.lo
  CC       base/ess_base_select.lo
  CC       base/ess_base_std_tool.lo
  CC       base/ess_base_get.lo
  CC       base/ess_base_std_orted.lo
  CC       base/ess_base_std_prolog.lo
  CC       base/ess_base_fns.lo
  CCLD     libmca_ess.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/orte/mca/ess'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../openmpi/orte/mca/ess/base/help-ess-base.txt '/opt/openmpi/share/openmpi'
make[3]: Leaving directory '/scratch/build/orte/mca/ess'
make[2]: Leaving directory '/scratch/build/orte/mca/ess'
Making install in mca/filem
make[2]: Entering directory '/scratch/build/orte/mca/filem'
  GENERATE orte_filem.7
  CC       base/filem_base_frame.lo
  CC       base/filem_base_select.lo
  CC       base/filem_base_receive.lo
  CC       base/filem_base_fns.lo
  CCLD     libmca_filem.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/orte/mca/filem'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/man/man7'
 /usr/bin/install -c -m 644 orte_filem.7 '/opt/openmpi/share/man/man7'
make[3]: Leaving directory '/scratch/build/orte/mca/filem'
make[2]: Leaving directory '/scratch/build/orte/mca/filem'
Making install in mca/grpcomm
make[2]: Entering directory '/scratch/build/orte/mca/grpcomm'
  CC       base/grpcomm_base_select.lo
  CC       base/grpcomm_base_frame.lo
  CC       base/grpcomm_base_stubs.lo
  CCLD     libmca_grpcomm.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/orte/mca/grpcomm'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/orte/mca/grpcomm'
make[2]: Leaving directory '/scratch/build/orte/mca/grpcomm'
Making install in mca/iof
make[2]: Entering directory '/scratch/build/orte/mca/iof'
  CC       base/iof_base_frame.lo
  CC       base/iof_base_select.lo
  CC       base/iof_base_output.lo
  CC       base/iof_base_setup.lo
  CCLD     libmca_iof.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/orte/mca/iof'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/orte/mca/iof'
make[2]: Leaving directory '/scratch/build/orte/mca/iof'
Making install in mca/odls
make[2]: Entering directory '/scratch/build/orte/mca/odls'
  CC       base/odls_base_frame.lo
  CC       base/odls_base_select.lo
  CC       base/odls_base_default_fns.lo
  CCLD     libmca_odls.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/orte/mca/odls'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../openmpi/orte/mca/odls/base/help-orte-odls-base.txt '/opt/openmpi/share/openmpi'
make[3]: Leaving directory '/scratch/build/orte/mca/odls'
make[2]: Leaving directory '/scratch/build/orte/mca/odls'
Making install in mca/oob
make[2]: Entering directory '/scratch/build/orte/mca/oob'
  CC       base/oob_base_frame.lo
  CC       base/oob_base_select.lo
  CC       base/oob_base_stubs.lo
  CCLD     libmca_oob.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/orte/mca/oob'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../openmpi/orte/mca/oob/base/help-oob-base.txt '/opt/openmpi/share/openmpi'
make[3]: Leaving directory '/scratch/build/orte/mca/oob'
make[2]: Leaving directory '/scratch/build/orte/mca/oob'
Making install in mca/plm
make[2]: Entering directory '/scratch/build/orte/mca/plm'
  CC       base/plm_base_frame.lo
  CC       base/plm_base_select.lo
  CC       base/plm_base_launch_support.lo
  CC       base/plm_base_receive.lo
  CC       base/plm_base_jobid.lo
  CC       base/plm_base_orted_cmds.lo
  CCLD     libmca_plm.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/orte/mca/plm'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../openmpi/orte/mca/plm/base/help-plm-base.txt '/opt/openmpi/share/openmpi'
make[3]: Leaving directory '/scratch/build/orte/mca/plm'
make[2]: Leaving directory '/scratch/build/orte/mca/plm'
Making install in mca/ras
make[2]: Entering directory '/scratch/build/orte/mca/ras'
  CC       base/ras_base_frame.lo
  CC       base/ras_base_select.lo
  CC       base/ras_base_node.lo
  CC       base/ras_base_allocate.lo
  CCLD     libmca_ras.la
 ---> Removed intermediate container 4c2f23eaa633
 ---> 28daa78ed186
Step 11/18 : ARG DPCPP_VERSION=2023.0.0
 ---> Running in a94653e9edda
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/orte/mca/ras'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../openmpi/orte/mca/ras/base/help-ras-base.txt '/opt/openmpi/share/openmpi'
make[3]: Leaving directory '/scratch/build/orte/mca/ras'
make[2]: Leaving directory '/scratch/build/orte/mca/ras'
Making install in mca/regx
make[2]: Entering directory '/scratch/build/orte/mca/regx'
  CC       base/regx_base_default_fns.lo
  CC       base/regx_base_select.lo
  CC       base/regx_base_frame.lo
  CCLD     libmca_regx.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/orte/mca/regx'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/orte/mca/regx'
make[2]: Leaving directory '/scratch/build/orte/mca/regx'
Making install in mca/rmaps
make[2]: Entering directory '/scratch/build/orte/mca/rmaps'
  CC       base/rmaps_base_frame.lo
  CC       base/rmaps_base_select.lo
  CC       base/rmaps_base_map_job.lo
  CC       base/rmaps_base_support_fns.lo
  CC       base/rmaps_base_ranking.lo
  CC       base/rmaps_base_print_fns.lo
  CC       base/rmaps_base_binding.lo
  CC       base/rmaps_base_assign_locations.lo
  CCLD     libmca_rmaps.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/orte/mca/rmaps'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../openmpi/orte/mca/rmaps/base/help-orte-rmaps-base.txt '/opt/openmpi/share/openmpi'
make[3]: Leaving directory '/scratch/build/orte/mca/rmaps'
make[2]: Leaving directory '/scratch/build/orte/mca/rmaps'
Making install in mca/rml
make[2]: Entering directory '/scratch/build/orte/mca/rml'
  CC       base/rml_base_frame.lo
  CC       base/rml_base_contact.lo
  CC       base/rml_base_msg_handlers.lo
  CC       base/rml_base_stubs.lo
  CCLD     libmca_rml.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/orte/mca/rml'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/orte/mca/rml'
make[2]: Leaving directory '/scratch/build/orte/mca/rml'
Making install in mca/routed
make[2]: Entering directory '/scratch/build/orte/mca/routed'
  CC       base/routed_base_frame.lo
  CC       base/routed_base_fns.lo
  CCLD     libmca_routed.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/orte/mca/routed'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/orte/mca/routed'
make[2]: Leaving directory '/scratch/build/orte/mca/routed'
Making install in mca/rtc
make[2]: Entering directory '/scratch/build/orte/mca/rtc'
  CC       base/rtc_base_frame.lo
  CC       base/rtc_base_select.lo
  CC       base/rtc_base_stubs.lo
  CCLD     libmca_rtc.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/orte/mca/rtc'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../openmpi/orte/mca/rtc/base/help-orte-rtc-base.txt '/opt/openmpi/share/openmpi'
make[3]: Leaving directory '/scratch/build/orte/mca/rtc'
make[2]: Leaving directory '/scratch/build/orte/mca/rtc'
Making install in mca/schizo
make[2]: Entering directory '/scratch/build/orte/mca/schizo'
  CC       base/schizo_base_frame.lo
  CC       base/schizo_base_select.lo
  CC       base/schizo_base_stubs.lo
  CCLD     libmca_schizo.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/orte/mca/schizo'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/orte/mca/schizo'
make[2]: Leaving directory '/scratch/build/orte/mca/schizo'
Making install in mca/snapc
make[2]: Entering directory '/scratch/build/orte/mca/snapc'
  GENERATE orte_snapc.7
  CC       base/snapc_base_frame.lo
  CC       base/snapc_base_select.lo
  CC       base/snapc_base_fns.lo
  CCLD     libmca_snapc.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/orte/mca/snapc'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/share/man/man7'
 /usr/bin/install -c -m 644 ../../../../openmpi/orte/mca/snapc/base/help-orte-snapc-base.txt '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 orte_snapc.7 '/opt/openmpi/share/man/man7'
make[3]: Leaving directory '/scratch/build/orte/mca/snapc'
make[2]: Leaving directory '/scratch/build/orte/mca/snapc'
Making install in mca/sstore
make[2]: Entering directory '/scratch/build/orte/mca/sstore'
  GENERATE orte_sstore.7
  CC       base/sstore_base_frame.lo
  CC       base/sstore_base_select.lo
  CC       base/sstore_base_fns.lo
  CCLD     libmca_sstore.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/orte/mca/sstore'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/man/man7'
 /usr/bin/install -c -m 644 orte_sstore.7 '/opt/openmpi/share/man/man7'
make[3]: Leaving directory '/scratch/build/orte/mca/sstore'
make[2]: Leaving directory '/scratch/build/orte/mca/sstore'
Making install in mca/state
make[2]: Entering directory '/scratch/build/orte/mca/state'
  CC       base/state_base_frame.lo
  CC       base/state_base_select.lo
  CC       base/state_base_fns.lo
  CCLD     libmca_state.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/orte/mca/state'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../openmpi/orte/mca/state/base/help-state-base.txt '/opt/openmpi/share/openmpi'
make[3]: Leaving directory '/scratch/build/orte/mca/state'
make[2]: Leaving directory '/scratch/build/orte/mca/state'
Making install in etc
make[2]: Entering directory '/scratch/build/orte/etc'
make[3]: Entering directory '/scratch/build/orte/etc'
make[3]: Nothing to be done for 'install-exec-am'.
/usr/bin/mkdir -p /opt/openmpi/etc
 /usr/bin/install -c -m 644 ../../../openmpi/orte/etc/openmpi-default-hostfile /opt/openmpi/etc/openmpi-default-hostfile
make[3]: Leaving directory '/scratch/build/orte/etc'
make[2]: Leaving directory '/scratch/build/orte/etc'
Making install in .
make[2]: Entering directory '/scratch/build/orte'
  CC       util/attr.lo
  CC       util/listener.lo
  CC       util/compress.lo
  CC       runtime/libruntime_mpir_la-orte_init.lo
  CC       orted/liborted_mpir_la-orted_submit.lo
  GENERATE util/hostfile/orte_hosts.7
  CC       runtime/orte_finalize.lo
  CC       runtime/orte_locks.lo
  CC       runtime/orte_globals.lo
  CC       runtime/orte_quit.lo
  CC       runtime/data_type_support/orte_dt_compare_fns.lo
  CC       runtime/data_type_support/orte_dt_copy_fns.lo
  CC       runtime/data_type_support/orte_dt_print_fns.lo
  CC       runtime/data_type_support/orte_dt_packing_fns.lo
  CC       runtime/data_type_support/orte_dt_unpacking_fns.lo
  CC       runtime/orte_mca_params.lo
  CC       runtime/orte_wait.lo
  CC       runtime/orte_cr.lo
  CC       runtime/orte_data_server.lo
  CC       runtime/orte_info_support.lo
  CC       util/error_strings.lo
  CC       util/name_fns.lo
  CC       util/proc_info.lo
  CC       util/session_dir.lo
  CC       util/show_help.lo
  CC       util/context_fns.lo
  CC       util/parse_options.lo
  CC       util/pre_condition_transports.lo
  CC       util/hnp_contact.lo
  CC       util/hostfile/hostfile_lex.lo
  CC       util/hostfile/hostfile.lo
  CC       util/dash_host/dash_host.lo
  CC       util/comm/comm.lo
  CC       orted/orted_main.lo
  CC       orted/orted_comm.lo
  CC       orted/pmix/pmix_server.lo
  CC       orted/pmix/pmix_server_fence.lo
  CC       orted/pmix/pmix_server_register_fns.lo
  CC       orted/pmix/pmix_server_dyn.lo
  CC       orted/pmix/pmix_server_pub.lo
  CC       orted/pmix/pmix_server_gen.lo
  CCLD     libruntime_mpir.la
  CCLD     liborted_mpir.la
ar: `u' modifier ignored since `D' is the default (see `U')
ar: `u' modifier ignored since `D' is the default (see `U')
  CCLD     libopen-rte.la
make[3]: Entering directory '/scratch/build/orte'
 /usr/bin/mkdir -p '/opt/openmpi/lib'
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/share/man/man7'
 /bin/bash ../libtool   --mode=install /usr/bin/install -c   libopen-rte.la '/opt/openmpi/lib'
 /usr/bin/install -c -m 644 ../../openmpi/orte/runtime/help-orte-runtime.txt ../../openmpi/orte/util/hostfile/help-hostfile.txt ../../openmpi/orte/util/dash_host/help-dash-host.txt ../../openmpi/orte/util/help-regex.txt ../../openmpi/orte/orted/help-orted.txt '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 util/hostfile/orte_hosts.7 '/opt/openmpi/share/man/man7'
libtool: warning: relinking 'libopen-rte.la'
libtool: install: (cd /scratch/build/orte; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -version-info 60:2:20 -o libopen-rte.la -rpath /opt/openmpi/lib runtime/orte_finalize.lo runtime/orte_locks.lo runtime/orte_globals.lo runtime/orte_quit.lo runtime/data_type_support/orte_dt_compare_fns.lo runtime/data_type_support/orte_dt_copy_fns.lo runtime/data_type_support/orte_dt_print_fns.lo runtime/data_type_support/orte_dt_packing_fns.lo runtime/data_type_support/orte_dt_unpacking_fns.lo runtime/orte_mca_params.lo runtime/orte_wait.lo runtime/orte_cr.lo runtime/orte_data_server.lo runtime/orte_info_support.lo util/error_strings.lo util/name_fns.lo util/proc_info.lo util/session_dir.lo util/show_help.lo util/context_fns.lo util/parse_options.lo util/pre_condition_transports.lo util/hnp_contact.lo util/hostfile/hostfile_lex.lo util/hostfile/hostfile.lo util/dash_host/dash_host.lo util/comm/comm.lo util/attr.lo util/listener.lo util/compress.lo orted/orted_main.lo orted/orted_comm.lo orted/pmix/pmix_server.lo orted/pmix/pmix_server_fence.lo orted/pmix/pmix_server_register_fns.lo orted/pmix/pmix_server_dyn.lo orted/pmix/pmix_server_pub.lo orted/pmix/pmix_server_gen.lo mca/errmgr/libmca_errmgr.la mca/ess/libmca_ess.la mca/filem/libmca_filem.la mca/grpcomm/libmca_grpcomm.la mca/iof/libmca_iof.la mca/odls/libmca_odls.la mca/oob/libmca_oob.la mca/plm/libmca_plm.la mca/ras/libmca_ras.la mca/regx/libmca_regx.la mca/rmaps/libmca_rmaps.la mca/rml/libmca_rml.la mca/routed/libmca_routed.la mca/rtc/libmca_rtc.la mca/schizo/libmca_schizo.la mca/snapc/libmca_snapc.la mca/sstore/libmca_sstore.la mca/state/libmca_state.la /scratch/build/opal/libopen-pal.la libruntime_mpir.la liborted_mpir.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/libopen-rte.so.40.20.2T /opt/openmpi/lib/libopen-rte.so.40.20.2
libtool: install: (cd /opt/openmpi/lib && { ln -s -f libopen-rte.so.40.20.2 libopen-rte.so.40 || { rm -f libopen-rte.so.40 && ln -s libopen-rte.so.40.20.2 libopen-rte.so.40; }; })
libtool: install: (cd /opt/openmpi/lib && { ln -s -f libopen-rte.so.40.20.2 libopen-rte.so || { rm -f libopen-rte.so && ln -s libopen-rte.so.40.20.2 libopen-rte.so; }; })
libtool: install: /usr/bin/install -c .libs/libopen-rte.lai /opt/openmpi/lib/libopen-rte.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
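[editor's note: illustrative sketch, not part of the build log] The libtool notice above explains how a program would be linked against the libraries just installed under /opt/openmpi/lib. A minimal, hedged example of what that looks like in practice is shown below; the source file my_app.c and the choice of baking in an rpath are placeholder assumptions for illustration, and in a real OpenMPI install the mpicc wrapper would normally supply these flags automatically.
 # hypothetical example: compile and link against the freshly installed libopen-rte
 gcc my_app.c -o my_app \
     -I/opt/openmpi/include \
     -L/opt/openmpi/lib -lopen-rte \
     -Wl,-rpath,/opt/openmpi/lib
 # alternatively, skip the rpath and set the runtime search path instead:
 #   export LD_LIBRARY_PATH=/opt/openmpi/lib:$LD_LIBRARY_PATH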
make[3]: Leaving directory '/scratch/build/orte'
make[2]: Leaving directory '/scratch/build/orte'
Making install in mca/errmgr/default_app
make[2]: Entering directory '/scratch/build/orte/mca/errmgr/default_app'
  CC       errmgr_default_app_component.lo
  CC       errmgr_default_app.lo
  CCLD     mca_errmgr_default_app.la
make[3]: Entering directory '/scratch/build/orte/mca/errmgr/default_app'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_errmgr_default_app.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_errmgr_default_app.la'
libtool: install: (cd /scratch/build/orte/mca/errmgr/default_app; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_errmgr_default_app.la -rpath /opt/openmpi/lib/openmpi errmgr_default_app_component.lo errmgr_default_app.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_errmgr_default_app.soT /opt/openmpi/lib/openmpi/mca_errmgr_default_app.so
libtool: install: /usr/bin/install -c .libs/mca_errmgr_default_app.lai /opt/openmpi/lib/openmpi/mca_errmgr_default_app.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib/openmpi

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
make[3]: Leaving directory '/scratch/build/orte/mca/errmgr/default_app'
make[2]: Leaving directory '/scratch/build/orte/mca/errmgr/default_app'
Making install in mca/errmgr/default_hnp
make[2]: Entering directory '/scratch/build/orte/mca/errmgr/default_hnp'
  CC       errmgr_default_hnp_component.lo
  CC       errmgr_default_hnp.lo
  CCLD     mca_errmgr_default_hnp.la
make[3]: Entering directory '/scratch/build/orte/mca/errmgr/default_hnp'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_errmgr_default_hnp.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_errmgr_default_hnp.la'
libtool: install: (cd /scratch/build/orte/mca/errmgr/default_hnp; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_errmgr_default_hnp.la -rpath /opt/openmpi/lib/openmpi errmgr_default_hnp_component.lo errmgr_default_hnp.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_errmgr_default_hnp.soT /opt/openmpi/lib/openmpi/mca_errmgr_default_hnp.so
libtool: install: /usr/bin/install -c .libs/mca_errmgr_default_hnp.lai /opt/openmpi/lib/openmpi/mca_errmgr_default_hnp.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib/openmpi

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
make[3]: Leaving directory '/scratch/build/orte/mca/errmgr/default_hnp'
make[2]: Leaving directory '/scratch/build/orte/mca/errmgr/default_hnp'
Making install in mca/errmgr/default_orted
make[2]: Entering directory '/scratch/build/orte/mca/errmgr/default_orted'
  CC       errmgr_default_orted_component.lo
  CC       errmgr_default_orted.lo
  CCLD     mca_errmgr_default_orted.la
make[3]: Entering directory '/scratch/build/orte/mca/errmgr/default_orted'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_errmgr_default_orted.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_errmgr_default_orted.la'
libtool: install: (cd /scratch/build/orte/mca/errmgr/default_orted; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_errmgr_default_orted.la -rpath /opt/openmpi/lib/openmpi errmgr_default_orted_component.lo errmgr_default_orted.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_errmgr_default_orted.soT /opt/openmpi/lib/openmpi/mca_errmgr_default_orted.so
libtool: install: /usr/bin/install -c .libs/mca_errmgr_default_orted.lai /opt/openmpi/lib/openmpi/mca_errmgr_default_orted.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib/openmpi

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
make[3]: Leaving directory '/scratch/build/orte/mca/errmgr/default_orted'
make[2]: Leaving directory '/scratch/build/orte/mca/errmgr/default_orted'
Making install in mca/errmgr/default_tool
make[2]: Entering directory '/scratch/build/orte/mca/errmgr/default_tool'
  CC       errmgr_default_tool_component.lo
  CC       errmgr_default_tool.lo
  CCLD     mca_errmgr_default_tool.la
make[3]: Entering directory '/scratch/build/orte/mca/errmgr/default_tool'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_errmgr_default_tool.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_errmgr_default_tool.la'
libtool: install: (cd /scratch/build/orte/mca/errmgr/default_tool; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_errmgr_default_tool.la -rpath /opt/openmpi/lib/openmpi errmgr_default_tool_component.lo errmgr_default_tool.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_errmgr_default_tool.soT /opt/openmpi/lib/openmpi/mca_errmgr_default_tool.so
libtool: install: /usr/bin/install -c .libs/mca_errmgr_default_tool.lai /opt/openmpi/lib/openmpi/mca_errmgr_default_tool.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib/openmpi

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
make[3]: Leaving directory '/scratch/build/orte/mca/errmgr/default_tool'
make[2]: Leaving directory '/scratch/build/orte/mca/errmgr/default_tool'
Making install in mca/ess/env
make[2]: Entering directory '/scratch/build/orte/mca/ess/env'
  CC       ess_env_component.lo
  CC       ess_env_module.lo
  CCLD     mca_ess_env.la
make[3]: Entering directory '/scratch/build/orte/mca/ess/env'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_ess_env.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_ess_env.la'
libtool: install: (cd /scratch/build/orte/mca/ess/env; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_ess_env.la -rpath /opt/openmpi/lib/openmpi ess_env_component.lo ess_env_module.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_ess_env.soT /opt/openmpi/lib/openmpi/mca_ess_env.so
libtool: install: /usr/bin/install -c .libs/mca_ess_env.lai /opt/openmpi/lib/openmpi/mca_ess_env.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib/openmpi

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
make[3]: Leaving directory '/scratch/build/orte/mca/ess/env'
make[2]: Leaving directory '/scratch/build/orte/mca/ess/env'
Making install in mca/ess/hnp
make[2]: Entering directory '/scratch/build/orte/mca/ess/hnp'
  CC       ess_hnp_component.lo
  CC       ess_hnp_module.lo
  CCLD     mca_ess_hnp.la
make[3]: Entering directory '/scratch/build/orte/mca/ess/hnp'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_ess_hnp.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_ess_hnp.la'
libtool: install: (cd /scratch/build/orte/mca/ess/hnp; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_ess_hnp.la -rpath /opt/openmpi/lib/openmpi ess_hnp_component.lo ess_hnp_module.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_ess_hnp.soT /opt/openmpi/lib/openmpi/mca_ess_hnp.so
libtool: install: /usr/bin/install -c .libs/mca_ess_hnp.lai /opt/openmpi/lib/openmpi/mca_ess_hnp.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib/openmpi

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
make[3]: Leaving directory '/scratch/build/orte/mca/ess/hnp'
make[2]: Leaving directory '/scratch/build/orte/mca/ess/hnp'
Making install in mca/ess/pmi
make[2]: Entering directory '/scratch/build/orte/mca/ess/pmi'
  CC       ess_pmi_component.lo
  CC       ess_pmi_module.lo
  CCLD     mca_ess_pmi.la
make[3]: Entering directory '/scratch/build/orte/mca/ess/pmi'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_ess_pmi.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_ess_pmi.la'
libtool: install: (cd /scratch/build/orte/mca/ess/pmi; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -fasynchronous-unwind-tables -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_ess_pmi.la -rpath /opt/openmpi/lib/openmpi ess_pmi_component.lo ess_pmi_module.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_ess_pmi.soT /opt/openmpi/lib/openmpi/mca_ess_pmi.so
libtool: install: /usr/bin/install -c .libs/mca_ess_pmi.lai /opt/openmpi/lib/openmpi/mca_ess_pmi.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib/openmpi

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
make[3]: Leaving directory '/scratch/build/orte/mca/ess/pmi'
make[2]: Leaving directory '/scratch/build/orte/mca/ess/pmi'
Making install in mca/ess/singleton
make[2]: Entering directory '/scratch/build/orte/mca/ess/singleton'
  CC       ess_singleton_component.lo
  CC       ess_singleton_module.lo
  CCLD     mca_ess_singleton.la
make[3]: Entering directory '/scratch/build/orte/mca/ess/singleton'
make[3]: Nothing to be done for 'install-exec-am'.
 ---> Removed intermediate container a94653e9edda
 ---> 1e11c37ad767
Step 12/18 : RUN wget https://apt.repos.intel.com/intel-gpg-keys/GPG-PUB-KEY-INTEL-SW-PRODUCTS-2023.PUB &&     apt-key add GPG-PUB-KEY-INTEL-SW-PRODUCTS-2023.PUB &&     echo "deb https://apt.repos.intel.com/oneapi all main" | tee /etc/apt/sources.list.d/oneAPI.list &&     apt-get update -o Dir::Etc::sourcelist="sources.list.d/oneAPI.list" -o APT::Get::List-Cleanup="0" &&     apt-get install -y intel-oneapi-compiler-dpcpp-cpp-${DPCPP_VERSION} &&     apt-get clean &&     rm -rf /var/lib/apt/lists/*
 ---> Running in 328177143f95
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_ess_singleton.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_ess_singleton.la'
libtool: install: (cd /scratch/build/orte/mca/ess/singleton; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_ess_singleton.la -rpath /opt/openmpi/lib/openmpi ess_singleton_component.lo ess_singleton_module.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_ess_singleton.soT /opt/openmpi/lib/openmpi/mca_ess_singleton.so
libtool: install: /usr/bin/install -c .libs/mca_ess_singleton.lai /opt/openmpi/lib/openmpi/mca_ess_singleton.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib/openmpi

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
make[3]: Leaving directory '/scratch/build/orte/mca/ess/singleton'
make[2]: Leaving directory '/scratch/build/orte/mca/ess/singleton'
Making install in mca/ess/tool
make[2]: Entering directory '/scratch/build/orte/mca/ess/tool'
  CC       ess_tool_component.lo
  CC       ess_tool_module.lo
  CCLD     mca_ess_tool.la
make[3]: Entering directory '/scratch/build/orte/mca/ess/tool'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_ess_tool.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_ess_tool.la'
libtool: install: (cd /scratch/build/orte/mca/ess/tool; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_ess_tool.la -rpath /opt/openmpi/lib/openmpi ess_tool_component.lo ess_tool_module.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_ess_tool.soT /opt/openmpi/lib/openmpi/mca_ess_tool.so
libtool: install: /usr/bin/install -c .libs/mca_ess_tool.lai /opt/openmpi/lib/openmpi/mca_ess_tool.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib/openmpi

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
make[3]: Leaving directory '/scratch/build/orte/mca/ess/tool'
make[2]: Leaving directory '/scratch/build/orte/mca/ess/tool'
Making install in mca/ess/slurm
make[2]: Entering directory '/scratch/build/orte/mca/ess/slurm'
  CC       ess_slurm_component.lo
  CC       ess_slurm_module.lo
  CCLD     mca_ess_slurm.la
make[3]: Entering directory '/scratch/build/orte/mca/ess/slurm'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_ess_slurm.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_ess_slurm.la'
libtool: install: (cd /scratch/build/orte/mca/ess/slurm; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_ess_slurm.la -rpath /opt/openmpi/lib/openmpi ess_slurm_component.lo ess_slurm_module.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_ess_slurm.soT /opt/openmpi/lib/openmpi/mca_ess_slurm.so
libtool: install: /usr/bin/install -c .libs/mca_ess_slurm.lai /opt/openmpi/lib/openmpi/mca_ess_slurm.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib/openmpi

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
make[3]: Leaving directory '/scratch/build/orte/mca/ess/slurm'
make[2]: Leaving directory '/scratch/build/orte/mca/ess/slurm'
Making install in mca/filem/raw
make[2]: Entering directory '/scratch/build/orte/mca/filem/raw'
  CC       filem_raw_component.lo
  CC       filem_raw_module.lo
  CCLD     mca_filem_raw.la
make[3]: Entering directory '/scratch/build/orte/mca/filem/raw'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/orte/mca/filem/raw/help-orte-filem-raw.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_filem_raw.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_filem_raw.la'
libtool: install: (cd /scratch/build/orte/mca/filem/raw; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_filem_raw.la -rpath /opt/openmpi/lib/openmpi filem_raw_component.lo filem_raw_module.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_filem_raw.soT /opt/openmpi/lib/openmpi/mca_filem_raw.so
libtool: install: /usr/bin/install -c .libs/mca_filem_raw.lai /opt/openmpi/lib/openmpi/mca_filem_raw.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib/openmpi

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
make[3]: Leaving directory '/scratch/build/orte/mca/filem/raw'
make[2]: Leaving directory '/scratch/build/orte/mca/filem/raw'
Making install in mca/grpcomm/direct
make[2]: Entering directory '/scratch/build/orte/mca/grpcomm/direct'
  CC       grpcomm_direct.lo
  CC       grpcomm_direct_component.lo
--2024-03-21 19:45:22--  https://apt.repos.intel.com/intel-gpg-keys/GPG-PUB-KEY-INTEL-SW-PRODUCTS-2023.PUB
Resolving apt.repos.intel.com (apt.repos.intel.com)... 23.33.34.115, 2600:1404:ec00:1195::a87, 2600:1404:ec00:118d::a87
Connecting to apt.repos.intel.com (apt.repos.intel.com)|23.33.34.115|:443... connected.
HTTP request sent, awaiting response... 200 OK
  CCLD     mca_grpcomm_direct.la
Length: 4738 (4.6K) [application/vnd.exstream-package]
Saving to: 'GPG-PUB-KEY-INTEL-SW-PRODUCTS-2023.PUB'

     0K ....                                                  100% 58.8M=0s

2024-03-21 19:45:23 (58.8 MB/s) - 'GPG-PUB-KEY-INTEL-SW-PRODUCTS-2023.PUB' saved [4738/4738]

make[3]: Entering directory '/scratch/build/orte/mca/grpcomm/direct'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_grpcomm_direct.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_grpcomm_direct.la'
libtool: install: (cd /scratch/build/orte/mca/grpcomm/direct; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_grpcomm_direct.la -rpath /opt/openmpi/lib/openmpi grpcomm_direct.lo grpcomm_direct_component.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
Warning: apt-key output should not be parsed (stdout is not a terminal)
libtool: install: /usr/bin/install -c .libs/mca_grpcomm_direct.soT /opt/openmpi/lib/openmpi/mca_grpcomm_direct.so
libtool: install: /usr/bin/install -c .libs/mca_grpcomm_direct.lai /opt/openmpi/lib/openmpi/mca_grpcomm_direct.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib/openmpi

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
make[3]: Leaving directory '/scratch/build/orte/mca/grpcomm/direct'
make[2]: Leaving directory '/scratch/build/orte/mca/grpcomm/direct'
Making install in mca/iof/hnp
make[2]: Entering directory '/scratch/build/orte/mca/iof/hnp'
  CC       iof_hnp.lo
  CC       iof_hnp_component.lo
  CC       iof_hnp_read.lo
  CC       iof_hnp_send.lo
  CC       iof_hnp_receive.lo
  CCLD     mca_iof_hnp.la
make[3]: Entering directory '/scratch/build/orte/mca/iof/hnp'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_iof_hnp.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_iof_hnp.la'
libtool: install: (cd /scratch/build/orte/mca/iof/hnp; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_iof_hnp.la -rpath /opt/openmpi/lib/openmpi iof_hnp.lo iof_hnp_component.lo iof_hnp_read.lo iof_hnp_send.lo iof_hnp_receive.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_iof_hnp.soT /opt/openmpi/lib/openmpi/mca_iof_hnp.so
libtool: install: /usr/bin/install -c .libs/mca_iof_hnp.lai /opt/openmpi/lib/openmpi/mca_iof_hnp.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib/openmpi

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
make[3]: Leaving directory '/scratch/build/orte/mca/iof/hnp'
make[2]: Leaving directory '/scratch/build/orte/mca/iof/hnp'
Making install in mca/iof/orted
make[2]: Entering directory '/scratch/build/orte/mca/iof/orted'
  CC       iof_orted.lo
  CC       iof_orted_component.lo
  CC       iof_orted_read.lo
  CC       iof_orted_receive.lo
  CCLD     mca_iof_orted.la
make[3]: Entering directory '/scratch/build/orte/mca/iof/orted'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_iof_orted.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_iof_orted.la'
libtool: install: (cd /scratch/build/orte/mca/iof/orted; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_iof_orted.la -rpath /opt/openmpi/lib/openmpi iof_orted.lo iof_orted_component.lo iof_orted_read.lo iof_orted_receive.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
OK
deb https://apt.repos.intel.com/oneapi all main
libtool: install: /usr/bin/install -c .libs/mca_iof_orted.soT /opt/openmpi/lib/openmpi/mca_iof_orted.so
libtool: install: /usr/bin/install -c .libs/mca_iof_orted.lai /opt/openmpi/lib/openmpi/mca_iof_orted.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib/openmpi

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
make[3]: Leaving directory '/scratch/build/orte/mca/iof/orted'
make[2]: Leaving directory '/scratch/build/orte/mca/iof/orted'
Making install in mca/iof/tool
make[2]: Entering directory '/scratch/build/orte/mca/iof/tool'
  CC       iof_tool.lo
  CC       iof_tool_component.lo
  CC       iof_tool_receive.lo
  CCLD     mca_iof_tool.la
make[3]: Entering directory '/scratch/build/orte/mca/iof/tool'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_iof_tool.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_iof_tool.la'
libtool: install: (cd /scratch/build/orte/mca/iof/tool; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_iof_tool.la -rpath /opt/openmpi/lib/openmpi iof_tool.lo iof_tool_component.lo iof_tool_receive.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_iof_tool.soT /opt/openmpi/lib/openmpi/mca_iof_tool.so
libtool: install: /usr/bin/install -c .libs/mca_iof_tool.lai /opt/openmpi/lib/openmpi/mca_iof_tool.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib/openmpi

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
make[3]: Leaving directory '/scratch/build/orte/mca/iof/tool'
make[2]: Leaving directory '/scratch/build/orte/mca/iof/tool'
Making install in mca/odls/default
make[2]: Entering directory '/scratch/build/orte/mca/odls/default'
  CC       odls_default_component.lo
  CC       odls_default_module.lo
  CCLD     mca_odls_default.la
make[3]: Entering directory '/scratch/build/orte/mca/odls/default'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/orte/mca/odls/default/help-orte-odls-default.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_odls_default.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_odls_default.la'
libtool: install: (cd /scratch/build/orte/mca/odls/default; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_odls_default.la -rpath /opt/openmpi/lib/openmpi odls_default_component.lo odls_default_module.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_odls_default.soT /opt/openmpi/lib/openmpi/mca_odls_default.so
libtool: install: /usr/bin/install -c .libs/mca_odls_default.lai /opt/openmpi/lib/openmpi/mca_odls_default.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib/openmpi

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
make[3]: Leaving directory '/scratch/build/orte/mca/odls/default'
make[2]: Leaving directory '/scratch/build/orte/mca/odls/default'
Making install in mca/odls/pspawn
make[2]: Entering directory '/scratch/build/orte/mca/odls/pspawn'
  CC       odls_pspawn_component.lo
  CC       odls_pspawn.lo
  CCLD     mca_odls_pspawn.la
make[3]: Entering directory '/scratch/build/orte/mca/odls/pspawn'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/orte/mca/odls/pspawn/help-orte-odls-pspawn.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_odls_pspawn.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_odls_pspawn.la'
libtool: install: (cd /scratch/build/orte/mca/odls/pspawn; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_odls_pspawn.la -rpath /opt/openmpi/lib/openmpi odls_pspawn_component.lo odls_pspawn.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_odls_pspawn.soT /opt/openmpi/lib/openmpi/mca_odls_pspawn.so
libtool: install: /usr/bin/install -c .libs/mca_odls_pspawn.lai /opt/openmpi/lib/openmpi/mca_odls_pspawn.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib/openmpi

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
make[3]: Leaving directory '/scratch/build/orte/mca/odls/pspawn'
make[2]: Leaving directory '/scratch/build/orte/mca/odls/pspawn'
Making install in mca/oob/tcp
make[2]: Entering directory '/scratch/build/orte/mca/oob/tcp'
  CC       oob_tcp_component.lo
  CC       oob_tcp.lo
  CC       oob_tcp_listener.lo
  CC       oob_tcp_common.lo
  CC       oob_tcp_connection.lo
  CC       oob_tcp_sendrecv.lo
  CCLD     mca_oob_tcp.la
make[3]: Entering directory '/scratch/build/orte/mca/oob/tcp'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/orte/mca/oob/tcp/help-oob-tcp.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_oob_tcp.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_oob_tcp.la'
libtool: install: (cd /scratch/build/orte/mca/oob/tcp; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_oob_tcp.la -rpath /opt/openmpi/lib/openmpi oob_tcp_component.lo oob_tcp.lo oob_tcp_listener.lo oob_tcp_common.lo oob_tcp_connection.lo oob_tcp_sendrecv.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_oob_tcp.soT /opt/openmpi/lib/openmpi/mca_oob_tcp.so
libtool: install: /usr/bin/install -c .libs/mca_oob_tcp.lai /opt/openmpi/lib/openmpi/mca_oob_tcp.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib/openmpi

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
make[3]: Leaving directory '/scratch/build/orte/mca/oob/tcp'
make[2]: Leaving directory '/scratch/build/orte/mca/oob/tcp'
Making install in mca/plm/isolated
make[2]: Entering directory '/scratch/build/orte/mca/plm/isolated'
  CC       plm_isolated_component.lo
  CC       plm_isolated.lo
Get:1 https://apt.repos.intel.com/oneapi all InRelease [4455 B]
Get:2 https://apt.repos.intel.com/oneapi all/main amd64 Packages [463 kB]
  CCLD     mca_plm_isolated.la
Get:3 https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64  InRelease [1581 B]
make[3]: Entering directory '/scratch/build/orte/mca/plm/isolated'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_plm_isolated.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_plm_isolated.la'
libtool: install: (cd /scratch/build/orte/mca/plm/isolated; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_plm_isolated.la -rpath /opt/openmpi/lib/openmpi plm_isolated_component.lo plm_isolated.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
Get:4 https://apt.repos.intel.com/oneapi all/main all Packages [133 kB]
libtool: install: /usr/bin/install -c .libs/mca_plm_isolated.soT /opt/openmpi/lib/openmpi/mca_plm_isolated.so
libtool: install: /usr/bin/install -c .libs/mca_plm_isolated.lai /opt/openmpi/lib/openmpi/mca_plm_isolated.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/plm/isolated'
make[2]: Leaving directory '/scratch/build/orte/mca/plm/isolated'
Making install in mca/plm/rsh
make[2]: Entering directory '/scratch/build/orte/mca/plm/rsh'
  CC       plm_rsh_component.lo
  CC       plm_rsh_module.lo
Get:5 https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64  Packages [1461 kB]
Fetched 2063 kB in 5s (389 kB/s)
Reading package lists...
  CCLD     mca_plm_rsh.la
make[3]: Entering directory '/scratch/build/orte/mca/plm/rsh'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/orte/mca/plm/rsh/help-plm-rsh.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_plm_rsh.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_plm_rsh.la'
libtool: install: (cd /scratch/build/orte/mca/plm/rsh; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_plm_rsh.la -rpath /opt/openmpi/lib/openmpi plm_rsh_component.lo plm_rsh_module.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
Reading package lists...
libtool: install: /usr/bin/install -c .libs/mca_plm_rsh.soT /opt/openmpi/lib/openmpi/mca_plm_rsh.so
libtool: install: /usr/bin/install -c .libs/mca_plm_rsh.lai /opt/openmpi/lib/openmpi/mca_plm_rsh.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/plm/rsh'
make[2]: Leaving directory '/scratch/build/orte/mca/plm/rsh'
Making install in mca/plm/slurm
make[2]: Entering directory '/scratch/build/orte/mca/plm/slurm'
  CC       plm_slurm_component.lo
  CC       plm_slurm_module.lo

Building dependency tree...
Reading state information...
The following additional packages will be installed:
  intel-oneapi-common-licensing-2023.0.0
  intel-oneapi-common-licensing-2023.1.0
  intel-oneapi-common-licensing-2023.2.0 intel-oneapi-common-licensing-2024.0
  intel-oneapi-common-oneapi-vars-2024.0 intel-oneapi-common-vars
  intel-oneapi-compiler-cpp-eclipse-cfg
  intel-oneapi-compiler-cpp-eclipse-cfg-2024.0
  intel-oneapi-compiler-dpcpp-cpp-common-2023.0.0
  intel-oneapi-compiler-dpcpp-cpp-runtime-2023.0.0
  intel-oneapi-compiler-dpcpp-eclipse-cfg
  intel-oneapi-compiler-dpcpp-eclipse-cfg-2024.0
  intel-oneapi-compiler-shared-2023.0.0
  intel-oneapi-compiler-shared-common-2023.0.0
  intel-oneapi-compiler-shared-runtime-2023.0.0 intel-oneapi-condaindex
  intel-oneapi-dev-utilities-2021.8.0 intel-oneapi-dev-utilities-eclipse-cfg
  intel-oneapi-dpcpp-cpp-2023.0.0 intel-oneapi-dpcpp-debugger-2023.0.0
  intel-oneapi-dpcpp-debugger-eclipse-cfg
  intel-oneapi-icc-eclipse-plugin-cpp-2023.0.0
  intel-oneapi-libdpstd-devel-2022.0.0 intel-oneapi-openmp-2023.0.0
  intel-oneapi-openmp-common-2023.0.0 intel-oneapi-tbb-2021.8.0
  intel-oneapi-tbb-common-2021.8.0 intel-oneapi-tbb-common-devel-2021.8.0
  intel-oneapi-tbb-devel-2021.8.0
  CCLD     mca_plm_slurm.la
The following NEW packages will be installed:
  intel-oneapi-common-licensing-2023.0.0
  intel-oneapi-common-licensing-2023.1.0
  intel-oneapi-common-licensing-2023.2.0 intel-oneapi-common-licensing-2024.0
  intel-oneapi-common-oneapi-vars-2024.0 intel-oneapi-common-vars
  intel-oneapi-compiler-cpp-eclipse-cfg
  intel-oneapi-compiler-cpp-eclipse-cfg-2024.0
  intel-oneapi-compiler-dpcpp-cpp-2023.0.0
  intel-oneapi-compiler-dpcpp-cpp-common-2023.0.0
  intel-oneapi-compiler-dpcpp-cpp-runtime-2023.0.0
  intel-oneapi-compiler-dpcpp-eclipse-cfg
  intel-oneapi-compiler-dpcpp-eclipse-cfg-2024.0
  intel-oneapi-compiler-shared-2023.0.0
  intel-oneapi-compiler-shared-common-2023.0.0
  intel-oneapi-compiler-shared-runtime-2023.0.0 intel-oneapi-condaindex
  intel-oneapi-dev-utilities-2021.8.0 intel-oneapi-dev-utilities-eclipse-cfg
  intel-oneapi-dpcpp-cpp-2023.0.0 intel-oneapi-dpcpp-debugger-2023.0.0
  intel-oneapi-dpcpp-debugger-eclipse-cfg
  intel-oneapi-icc-eclipse-plugin-cpp-2023.0.0
  intel-oneapi-libdpstd-devel-2022.0.0 intel-oneapi-openmp-2023.0.0
  intel-oneapi-openmp-common-2023.0.0 intel-oneapi-tbb-2021.8.0
  intel-oneapi-tbb-common-2021.8.0 intel-oneapi-tbb-common-devel-2021.8.0
  intel-oneapi-tbb-devel-2021.8.0
make[3]: Entering directory '/scratch/build/orte/mca/plm/slurm'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/orte/mca/plm/slurm/help-plm-slurm.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_plm_slurm.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_plm_slurm.la'
libtool: install: (cd /scratch/build/orte/mca/plm/slurm; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_plm_slurm.la -rpath /opt/openmpi/lib/openmpi plm_slurm_component.lo plm_slurm_module.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_plm_slurm.soT /opt/openmpi/lib/openmpi/mca_plm_slurm.so
libtool: install: /usr/bin/install -c .libs/mca_plm_slurm.lai /opt/openmpi/lib/openmpi/mca_plm_slurm.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/plm/slurm'
make[2]: Leaving directory '/scratch/build/orte/mca/plm/slurm'
Making install in mca/ras/simulator
make[2]: Entering directory '/scratch/build/orte/mca/ras/simulator'
  CC       ras_sim_component.lo
  CC       ras_sim_module.lo
  CCLD     mca_ras_simulator.la
make[3]: Entering directory '/scratch/build/orte/mca/ras/simulator'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/orte/mca/ras/simulator/help-ras-simulator.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_ras_simulator.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_ras_simulator.la'
libtool: install: (cd /scratch/build/orte/mca/ras/simulator; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_ras_simulator.la -rpath /opt/openmpi/lib/openmpi ras_sim_component.lo ras_sim_module.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_ras_simulator.soT /opt/openmpi/lib/openmpi/mca_ras_simulator.so
libtool: install: /usr/bin/install -c .libs/mca_ras_simulator.lai /opt/openmpi/lib/openmpi/mca_ras_simulator.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/ras/simulator'
make[2]: Leaving directory '/scratch/build/orte/mca/ras/simulator'
Making install in mca/ras/slurm
0 upgraded, 30 newly installed, 0 to remove and 2 not upgraded.
Need to get 1215 MB of archives.
After this operation, 4742 MB of additional disk space will be used.
Get:1 https://apt.repos.intel.com/oneapi all/main all intel-oneapi-common-licensing-2023.0.0 all 2023.0.0-25325 [30.5 kB]
make[2]: Entering directory '/scratch/build/orte/mca/ras/slurm'
  CC       ras_slurm_component.lo
  CC       ras_slurm_module.lo
  CCLD     mca_ras_slurm.la
make[3]: Entering directory '/scratch/build/orte/mca/ras/slurm'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/orte/mca/ras/slurm/help-ras-slurm.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_ras_slurm.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_ras_slurm.la'
libtool: install: (cd /scratch/build/orte/mca/ras/slurm; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_ras_slurm.la -rpath /opt/openmpi/lib/openmpi ras_slurm_component.lo ras_slurm_module.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
Get:2 https://apt.repos.intel.com/oneapi all/main all intel-oneapi-common-licensing-2023.1.0 all 2023.1.0-43473 [30.5 kB]
libtool: install: /usr/bin/install -c .libs/mca_ras_slurm.soT /opt/openmpi/lib/openmpi/mca_ras_slurm.so
libtool: install: /usr/bin/install -c .libs/mca_ras_slurm.lai /opt/openmpi/lib/openmpi/mca_ras_slurm.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/ras/slurm'
make[2]: Leaving directory '/scratch/build/orte/mca/ras/slurm'
Making install in mca/regx/fwd
make[2]: Entering directory '/scratch/build/orte/mca/regx/fwd'
  CC       regx_fwd_component.lo
  CC       regx_fwd.lo
  CCLD     mca_regx_fwd.la
Get:3 https://apt.repos.intel.com/oneapi all/main all intel-oneapi-common-licensing-2023.2.0 all 2023.2.0-49462 [30.4 kB]
Get:4 https://apt.repos.intel.com/oneapi all/main all intel-oneapi-common-licensing-2024.0 all 2024.0.0-49406 [30.7 kB]
make[3]: Entering directory '/scratch/build/orte/mca/regx/fwd'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_regx_fwd.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_regx_fwd.la'
libtool: install: (cd /scratch/build/orte/mca/regx/fwd; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_regx_fwd.la -rpath /opt/openmpi/lib/openmpi regx_fwd_component.lo regx_fwd.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
Get:5 https://apt.repos.intel.com/oneapi all/main all intel-oneapi-common-oneapi-vars-2024.0 all 2024.0.0-49406 [10.4 kB]
Get:6 https://apt.repos.intel.com/oneapi all/main all intel-oneapi-common-vars all 2024.0.0-49406 [12.2 kB]
Get:7 https://apt.repos.intel.com/oneapi all/main all intel-oneapi-compiler-cpp-eclipse-cfg-2024.0 all 2024.0.2-49895 [2852 B]
libtool: install: /usr/bin/install -c .libs/mca_regx_fwd.soT /opt/openmpi/lib/openmpi/mca_regx_fwd.so
libtool: install: /usr/bin/install -c .libs/mca_regx_fwd.lai /opt/openmpi/lib/openmpi/mca_regx_fwd.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/regx/fwd'
make[2]: Leaving directory '/scratch/build/orte/mca/regx/fwd'
Making install in mca/regx/naive
make[2]: Entering directory '/scratch/build/orte/mca/regx/naive'
  CC       regx_naive_component.lo
  CC       regx_naive.lo
Get:8 https://apt.repos.intel.com/oneapi all/main all intel-oneapi-compiler-cpp-eclipse-cfg all 2024.0.2-49895 [1824 B]
Get:9 https://apt.repos.intel.com/oneapi all/main all intel-oneapi-compiler-dpcpp-eclipse-cfg-2024.0 all 2024.0.2-49895 [2476 B]
  CCLD     mca_regx_naive.la
Get:10 https://apt.repos.intel.com/oneapi all/main all intel-oneapi-compiler-dpcpp-eclipse-cfg all 2024.0.2-49895 [1808 B]
Get:11 https://apt.repos.intel.com/oneapi all/main all intel-oneapi-icc-eclipse-plugin-cpp-2023.0.0 all 2023.0.0-25370 [1976 B]
make[3]: Entering directory '/scratch/build/orte/mca/regx/naive'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_regx_naive.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_regx_naive.la'
libtool: install: (cd /scratch/build/orte/mca/regx/naive; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_regx_naive.la -rpath /opt/openmpi/lib/openmpi regx_naive_component.lo regx_naive.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
Get:12 https://apt.repos.intel.com/oneapi all/main all intel-oneapi-compiler-dpcpp-cpp-common-2023.0.0 all 2023.0.0-25370 [1858 kB]
libtool: install: /usr/bin/install -c .libs/mca_regx_naive.soT /opt/openmpi/lib/openmpi/mca_regx_naive.so
libtool: install: /usr/bin/install -c .libs/mca_regx_naive.lai /opt/openmpi/lib/openmpi/mca_regx_naive.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/regx/naive'
make[2]: Leaving directory '/scratch/build/orte/mca/regx/naive'
Making install in mca/regx/reverse
make[2]: Entering directory '/scratch/build/orte/mca/regx/reverse'
  CC       regx_reverse_component.lo
  CC       regx_reverse.lo
Get:13 https://apt.repos.intel.com/oneapi all/main amd64 intel-oneapi-condaindex amd64 2023.2.0-49417 [676 kB]
Get:14 https://apt.repos.intel.com/oneapi all/main all intel-oneapi-openmp-common-2023.0.0 all 2023.0.0-25370 [19.2 kB]
  CCLD     mca_regx_reverse.la
make[3]: Entering directory '/scratch/build/orte/mca/regx/reverse'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_regx_reverse.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_regx_reverse.la'
libtool: install: (cd /scratch/build/orte/mca/regx/reverse; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_regx_reverse.la -rpath /opt/openmpi/lib/openmpi regx_reverse_component.lo regx_reverse.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
Get:15 https://apt.repos.intel.com/oneapi all/main amd64 intel-oneapi-openmp-2023.0.0 amd64 2023.0.0-25370 [221 MB]
libtool: install: /usr/bin/install -c .libs/mca_regx_reverse.soT /opt/openmpi/lib/openmpi/mca_regx_reverse.so
libtool: install: /usr/bin/install -c .libs/mca_regx_reverse.lai /opt/openmpi/lib/openmpi/mca_regx_reverse.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/regx/reverse'
make[2]: Leaving directory '/scratch/build/orte/mca/regx/reverse'
Making install in mca/rmaps/mindist
make[2]: Entering directory '/scratch/build/orte/mca/rmaps/mindist'
  CC       rmaps_mindist_module.lo
  CC       rmaps_mindist_component.lo
  CCLD     mca_rmaps_mindist.la
make[3]: Entering directory '/scratch/build/orte/mca/rmaps/mindist'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/orte/mca/rmaps/mindist/help-orte-rmaps-md.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_rmaps_mindist.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_rmaps_mindist.la'
libtool: install: (cd /scratch/build/orte/mca/rmaps/mindist; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_rmaps_mindist.la -rpath /opt/openmpi/lib/openmpi rmaps_mindist_module.lo rmaps_mindist_component.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_rmaps_mindist.soT /opt/openmpi/lib/openmpi/mca_rmaps_mindist.so
libtool: install: /usr/bin/install -c .libs/mca_rmaps_mindist.lai /opt/openmpi/lib/openmpi/mca_rmaps_mindist.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/rmaps/mindist'
make[2]: Leaving directory '/scratch/build/orte/mca/rmaps/mindist'
Making install in mca/rmaps/ppr
make[2]: Entering directory '/scratch/build/orte/mca/rmaps/ppr'
  CC       rmaps_ppr.lo
  CC       rmaps_ppr_component.lo
  CCLD     mca_rmaps_ppr.la
make[3]: Entering directory '/scratch/build/orte/mca/rmaps/ppr'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/orte/mca/rmaps/ppr/help-orte-rmaps-ppr.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_rmaps_ppr.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_rmaps_ppr.la'
libtool: install: (cd /scratch/build/orte/mca/rmaps/ppr; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_rmaps_ppr.la -rpath /opt/openmpi/lib/openmpi rmaps_ppr.lo rmaps_ppr_component.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_rmaps_ppr.soT /opt/openmpi/lib/openmpi/mca_rmaps_ppr.so
libtool: install: /usr/bin/install -c .libs/mca_rmaps_ppr.lai /opt/openmpi/lib/openmpi/mca_rmaps_ppr.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/rmaps/ppr'
make[2]: Leaving directory '/scratch/build/orte/mca/rmaps/ppr'
Making install in mca/rmaps/rank_file
make[2]: Entering directory '/scratch/build/orte/mca/rmaps/rank_file'
  CC       rmaps_rank_file.lo
  CC       rmaps_rank_file_component.lo
  CC       rmaps_rank_file_lex.lo
  CCLD     mca_rmaps_rank_file.la
make[3]: Entering directory '/scratch/build/orte/mca/rmaps/rank_file'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/orte/mca/rmaps/rank_file/help-rmaps_rank_file.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_rmaps_rank_file.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_rmaps_rank_file.la'
libtool: install: (cd /scratch/build/orte/mca/rmaps/rank_file; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_rmaps_rank_file.la -rpath /opt/openmpi/lib/openmpi rmaps_rank_file.lo rmaps_rank_file_component.lo rmaps_rank_file_lex.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_rmaps_rank_file.soT /opt/openmpi/lib/openmpi/mca_rmaps_rank_file.so
libtool: install: /usr/bin/install -c .libs/mca_rmaps_rank_file.lai /opt/openmpi/lib/openmpi/mca_rmaps_rank_file.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/rmaps/rank_file'
make[2]: Leaving directory '/scratch/build/orte/mca/rmaps/rank_file'
Making install in mca/rmaps/resilient
make[2]: Entering directory '/scratch/build/orte/mca/rmaps/resilient'
  CC       rmaps_resilient.lo
  CC       rmaps_resilient_component.lo
  CCLD     mca_rmaps_resilient.la
make[3]: Entering directory '/scratch/build/orte/mca/rmaps/resilient'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/orte/mca/rmaps/resilient/help-orte-rmaps-resilient.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_rmaps_resilient.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_rmaps_resilient.la'
libtool: install: (cd /scratch/build/orte/mca/rmaps/resilient; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_rmaps_resilient.la -rpath /opt/openmpi/lib/openmpi rmaps_resilient.lo rmaps_resilient_component.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_rmaps_resilient.soT /opt/openmpi/lib/openmpi/mca_rmaps_resilient.so
libtool: install: /usr/bin/install -c .libs/mca_rmaps_resilient.lai /opt/openmpi/lib/openmpi/mca_rmaps_resilient.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/rmaps/resilient'
make[2]: Leaving directory '/scratch/build/orte/mca/rmaps/resilient'
Making install in mca/rmaps/round_robin
make[2]: Entering directory '/scratch/build/orte/mca/rmaps/round_robin'
  CC       rmaps_rr.lo
  CC       rmaps_rr_component.lo
  CC       rmaps_rr_mappers.lo
  CC       rmaps_rr_assign.lo
  CCLD     mca_rmaps_round_robin.la
make[3]: Entering directory '/scratch/build/orte/mca/rmaps/round_robin'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/orte/mca/rmaps/round_robin/help-orte-rmaps-rr.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_rmaps_round_robin.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_rmaps_round_robin.la'
libtool: install: (cd /scratch/build/orte/mca/rmaps/round_robin; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_rmaps_round_robin.la -rpath /opt/openmpi/lib/openmpi rmaps_rr.lo rmaps_rr_component.lo rmaps_rr_mappers.lo rmaps_rr_assign.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_rmaps_round_robin.soT /opt/openmpi/lib/openmpi/mca_rmaps_round_robin.so
libtool: install: /usr/bin/install -c .libs/mca_rmaps_round_robin.lai /opt/openmpi/lib/openmpi/mca_rmaps_round_robin.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/rmaps/round_robin'
make[2]: Leaving directory '/scratch/build/orte/mca/rmaps/round_robin'
Making install in mca/rmaps/seq
make[2]: Entering directory '/scratch/build/orte/mca/rmaps/seq'
  CC       rmaps_seq.lo
  CC       rmaps_seq_component.lo
  CCLD     mca_rmaps_seq.la
make[3]: Entering directory '/scratch/build/orte/mca/rmaps/seq'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/orte/mca/rmaps/seq/help-orte-rmaps-seq.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_rmaps_seq.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_rmaps_seq.la'
libtool: install: (cd /scratch/build/orte/mca/rmaps/seq; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_rmaps_seq.la -rpath /opt/openmpi/lib/openmpi rmaps_seq.lo rmaps_seq_component.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_rmaps_seq.soT /opt/openmpi/lib/openmpi/mca_rmaps_seq.so
libtool: install: /usr/bin/install -c .libs/mca_rmaps_seq.lai /opt/openmpi/lib/openmpi/mca_rmaps_seq.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/rmaps/seq'
make[2]: Leaving directory '/scratch/build/orte/mca/rmaps/seq'
Making install in mca/rml/oob
make[2]: Entering directory '/scratch/build/orte/mca/rml/oob'
  CC       rml_oob_component.lo
  CC       rml_oob_send.lo
  CCLD     mca_rml_oob.la
make[3]: Entering directory '/scratch/build/orte/mca/rml/oob'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_rml_oob.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_rml_oob.la'
libtool: install: (cd /scratch/build/orte/mca/rml/oob; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_rml_oob.la -rpath /opt/openmpi/lib/openmpi rml_oob_component.lo rml_oob_send.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_rml_oob.soT /opt/openmpi/lib/openmpi/mca_rml_oob.so
libtool: install: /usr/bin/install -c .libs/mca_rml_oob.lai /opt/openmpi/lib/openmpi/mca_rml_oob.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/rml/oob'
make[2]: Leaving directory '/scratch/build/orte/mca/rml/oob'
Making install in mca/routed/binomial
make[2]: Entering directory '/scratch/build/orte/mca/routed/binomial'
  CC       routed_binomial.lo
  CC       routed_binomial_component.lo
  CCLD     mca_routed_binomial.la
make[3]: Entering directory '/scratch/build/orte/mca/routed/binomial'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_routed_binomial.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_routed_binomial.la'
libtool: install: (cd /scratch/build/orte/mca/routed/binomial; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_routed_binomial.la -rpath /opt/openmpi/lib/openmpi routed_binomial.lo routed_binomial_component.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_routed_binomial.soT /opt/openmpi/lib/openmpi/mca_routed_binomial.so
libtool: install: /usr/bin/install -c .libs/mca_routed_binomial.lai /opt/openmpi/lib/openmpi/mca_routed_binomial.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/routed/binomial'
make[2]: Leaving directory '/scratch/build/orte/mca/routed/binomial'
Making install in mca/routed/direct
make[2]: Entering directory '/scratch/build/orte/mca/routed/direct'
  CC       routed_direct.lo
  CC       routed_direct_component.lo
  CCLD     mca_routed_direct.la
make[3]: Entering directory '/scratch/build/orte/mca/routed/direct'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_routed_direct.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_routed_direct.la'
libtool: install: (cd /scratch/build/orte/mca/routed/direct; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_routed_direct.la -rpath /opt/openmpi/lib/openmpi routed_direct.lo routed_direct_component.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_routed_direct.soT /opt/openmpi/lib/openmpi/mca_routed_direct.so
libtool: install: /usr/bin/install -c .libs/mca_routed_direct.lai /opt/openmpi/lib/openmpi/mca_routed_direct.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/routed/direct'
make[2]: Leaving directory '/scratch/build/orte/mca/routed/direct'
Making install in mca/routed/radix
make[2]: Entering directory '/scratch/build/orte/mca/routed/radix'
  CC       routed_radix.lo
  CC       routed_radix_component.lo
  CCLD     mca_routed_radix.la
make[3]: Entering directory '/scratch/build/orte/mca/routed/radix'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_routed_radix.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_routed_radix.la'
libtool: install: (cd /scratch/build/orte/mca/routed/radix; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_routed_radix.la -rpath /opt/openmpi/lib/openmpi routed_radix.lo routed_radix_component.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_routed_radix.soT /opt/openmpi/lib/openmpi/mca_routed_radix.so
libtool: install: /usr/bin/install -c .libs/mca_routed_radix.lai /opt/openmpi/lib/openmpi/mca_routed_radix.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/routed/radix'
make[2]: Leaving directory '/scratch/build/orte/mca/routed/radix'
Making install in mca/rtc/hwloc
make[2]: Entering directory '/scratch/build/orte/mca/rtc/hwloc'
  CC       rtc_hwloc.lo
  CC       rtc_hwloc_component.lo
  CCLD     mca_rtc_hwloc.la
make[3]: Entering directory '/scratch/build/orte/mca/rtc/hwloc'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/orte/mca/rtc/hwloc/help-orte-rtc-hwloc.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_rtc_hwloc.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_rtc_hwloc.la'
libtool: install: (cd /scratch/build/orte/mca/rtc/hwloc; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_rtc_hwloc.la -rpath /opt/openmpi/lib/openmpi rtc_hwloc.lo rtc_hwloc_component.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_rtc_hwloc.soT /opt/openmpi/lib/openmpi/mca_rtc_hwloc.so
libtool: install: /usr/bin/install -c .libs/mca_rtc_hwloc.lai /opt/openmpi/lib/openmpi/mca_rtc_hwloc.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/rtc/hwloc'
make[2]: Leaving directory '/scratch/build/orte/mca/rtc/hwloc'
Making install in mca/schizo/flux
make[2]: Entering directory '/scratch/build/orte/mca/schizo/flux'
  CC       schizo_flux_component.lo
  CC       schizo_flux.lo
  CCLD     mca_schizo_flux.la
make[3]: Entering directory '/scratch/build/orte/mca/schizo/flux'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_schizo_flux.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_schizo_flux.la'
libtool: install: (cd /scratch/build/orte/mca/schizo/flux; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_schizo_flux.la -rpath /opt/openmpi/lib/openmpi schizo_flux_component.lo schizo_flux.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_schizo_flux.soT /opt/openmpi/lib/openmpi/mca_schizo_flux.so
libtool: install: /usr/bin/install -c .libs/mca_schizo_flux.lai /opt/openmpi/lib/openmpi/mca_schizo_flux.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/schizo/flux'
make[2]: Leaving directory '/scratch/build/orte/mca/schizo/flux'
Making install in mca/schizo/ompi
make[2]: Entering directory '/scratch/build/orte/mca/schizo/ompi'
  CC       schizo_ompi_component.lo
  CC       schizo_ompi.lo
  CCLD     mca_schizo_ompi.la
make[3]: Entering directory '/scratch/build/orte/mca/schizo/ompi'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_schizo_ompi.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_schizo_ompi.la'
libtool: install: (cd /scratch/build/orte/mca/schizo/ompi; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_schizo_ompi.la -rpath /opt/openmpi/lib/openmpi schizo_ompi_component.lo schizo_ompi.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_schizo_ompi.soT /opt/openmpi/lib/openmpi/mca_schizo_ompi.so
libtool: install: /usr/bin/install -c .libs/mca_schizo_ompi.lai /opt/openmpi/lib/openmpi/mca_schizo_ompi.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/schizo/ompi'
make[2]: Leaving directory '/scratch/build/orte/mca/schizo/ompi'
Making install in mca/schizo/orte
make[2]: Entering directory '/scratch/build/orte/mca/schizo/orte'
  CC       schizo_orte_component.lo
  CC       schizo_orte.lo
  CCLD     mca_schizo_orte.la
make[3]: Entering directory '/scratch/build/orte/mca/schizo/orte'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_schizo_orte.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_schizo_orte.la'
libtool: install: (cd /scratch/build/orte/mca/schizo/orte; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_schizo_orte.la -rpath /opt/openmpi/lib/openmpi schizo_orte_component.lo schizo_orte.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_schizo_orte.soT /opt/openmpi/lib/openmpi/mca_schizo_orte.so
libtool: install: /usr/bin/install -c .libs/mca_schizo_orte.lai /opt/openmpi/lib/openmpi/mca_schizo_orte.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/schizo/orte'
make[2]: Leaving directory '/scratch/build/orte/mca/schizo/orte'
Making install in mca/schizo/slurm
make[2]: Entering directory '/scratch/build/orte/mca/schizo/slurm'
  CC       schizo_slurm_component.lo
  CC       schizo_slurm.lo
  CCLD     mca_schizo_slurm.la
make[3]: Entering directory '/scratch/build/orte/mca/schizo/slurm'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_schizo_slurm.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_schizo_slurm.la'
libtool: install: (cd /scratch/build/orte/mca/schizo/slurm; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_schizo_slurm.la -rpath /opt/openmpi/lib/openmpi schizo_slurm_component.lo schizo_slurm.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_schizo_slurm.soT /opt/openmpi/lib/openmpi/mca_schizo_slurm.so
libtool: install: /usr/bin/install -c .libs/mca_schizo_slurm.lai /opt/openmpi/lib/openmpi/mca_schizo_slurm.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/schizo/slurm'
make[2]: Leaving directory '/scratch/build/orte/mca/schizo/slurm'
Making install in mca/state/app
make[2]: Entering directory '/scratch/build/orte/mca/state/app'
  CC       state_app_component.lo
  CC       state_app.lo
  CCLD     mca_state_app.la
make[3]: Entering directory '/scratch/build/orte/mca/state/app'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_state_app.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_state_app.la'
libtool: install: (cd /scratch/build/orte/mca/state/app; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_state_app.la -rpath /opt/openmpi/lib/openmpi state_app_component.lo state_app.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_state_app.soT /opt/openmpi/lib/openmpi/mca_state_app.so
libtool: install: /usr/bin/install -c .libs/mca_state_app.lai /opt/openmpi/lib/openmpi/mca_state_app.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/state/app'
make[2]: Leaving directory '/scratch/build/orte/mca/state/app'
Making install in mca/state/hnp
make[2]: Entering directory '/scratch/build/orte/mca/state/hnp'
  CC       state_hnp_component.lo
  CC       state_hnp.lo
  CCLD     mca_state_hnp.la
make[3]: Entering directory '/scratch/build/orte/mca/state/hnp'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_state_hnp.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_state_hnp.la'
libtool: install: (cd /scratch/build/orte/mca/state/hnp; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_state_hnp.la -rpath /opt/openmpi/lib/openmpi state_hnp_component.lo state_hnp.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_state_hnp.soT /opt/openmpi/lib/openmpi/mca_state_hnp.so
libtool: install: /usr/bin/install -c .libs/mca_state_hnp.lai /opt/openmpi/lib/openmpi/mca_state_hnp.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/state/hnp'
make[2]: Leaving directory '/scratch/build/orte/mca/state/hnp'
Making install in mca/state/novm
make[2]: Entering directory '/scratch/build/orte/mca/state/novm'
  CC       state_novm_component.lo
  CC       state_novm.lo
  CCLD     mca_state_novm.la
make[3]: Entering directory '/scratch/build/orte/mca/state/novm'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_state_novm.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_state_novm.la'
libtool: install: (cd /scratch/build/orte/mca/state/novm; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_state_novm.la -rpath /opt/openmpi/lib/openmpi state_novm_component.lo state_novm.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_state_novm.soT /opt/openmpi/lib/openmpi/mca_state_novm.so
libtool: install: /usr/bin/install -c .libs/mca_state_novm.lai /opt/openmpi/lib/openmpi/mca_state_novm.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/state/novm'
make[2]: Leaving directory '/scratch/build/orte/mca/state/novm'
Making install in mca/state/orted
make[2]: Entering directory '/scratch/build/orte/mca/state/orted'
  CC       state_orted_component.lo
  CC       state_orted.lo
  CCLD     mca_state_orted.la
make[3]: Entering directory '/scratch/build/orte/mca/state/orted'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_state_orted.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_state_orted.la'
libtool: install: (cd /scratch/build/orte/mca/state/orted; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_state_orted.la -rpath /opt/openmpi/lib/openmpi state_orted_component.lo state_orted.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_state_orted.soT /opt/openmpi/lib/openmpi/mca_state_orted.so
libtool: install: /usr/bin/install -c .libs/mca_state_orted.lai /opt/openmpi/lib/openmpi/mca_state_orted.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/state/orted'
make[2]: Leaving directory '/scratch/build/orte/mca/state/orted'
Making install in mca/state/tool
make[2]: Entering directory '/scratch/build/orte/mca/state/tool'
  CC       state_tool_component.lo
  CC       state_tool.lo
  CCLD     mca_state_tool.la
make[3]: Entering directory '/scratch/build/orte/mca/state/tool'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_state_tool.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_state_tool.la'
libtool: install: (cd /scratch/build/orte/mca/state/tool; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_state_tool.la -rpath /opt/openmpi/lib/openmpi state_tool_component.lo state_tool.lo ../../../../orte/libopen-rte.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_state_tool.soT /opt/openmpi/lib/openmpi/mca_state_tool.so
libtool: install: /usr/bin/install -c .libs/mca_state_tool.lai /opt/openmpi/lib/openmpi/mca_state_tool.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/orte/mca/state/tool'
make[2]: Leaving directory '/scratch/build/orte/mca/state/tool'
Making install in tools/orte-clean
make[2]: Entering directory '/scratch/build/orte/tools/orte-clean'
  CC       orte-clean.o
  GENERATE orte-clean.1
  CCLD     orte-clean
make[3]: Entering directory '/scratch/build/orte/tools/orte-clean'
 /usr/bin/mkdir -p '/opt/openmpi/bin'
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/share/man/man1'
 /usr/bin/install -c -m 644 ../../../../openmpi/orte/tools/orte-clean/help-orte-clean.txt '/opt/openmpi/share/openmpi'
  /bin/bash ../../../libtool   --mode=install /usr/bin/install -c orte-clean '/opt/openmpi/bin'
 /usr/bin/install -c -m 644 orte-clean.1 '/opt/openmpi/share/man/man1'
libtool: install: /usr/bin/install -c .libs/orte-clean /opt/openmpi/bin/orte-clean
make[3]: Leaving directory '/scratch/build/orte/tools/orte-clean'
make[2]: Leaving directory '/scratch/build/orte/tools/orte-clean'
Making install in tools/orted
make[2]: Entering directory '/scratch/build/orte/tools/orted'
  CC       orted.o
  GENERATE orted.1
  CCLD     orted
make[3]: Entering directory '/scratch/build/orte/tools/orted'
 /usr/bin/mkdir -p '/opt/openmpi/bin'
 /usr/bin/mkdir -p '/opt/openmpi/share/man/man1'
  /bin/bash ../../../libtool   --mode=install /usr/bin/install -c orted '/opt/openmpi/bin'
 /usr/bin/install -c -m 644 orted.1 '/opt/openmpi/share/man/man1'
libtool: install: /usr/bin/install -c .libs/orted /opt/openmpi/bin/orted
make[3]: Leaving directory '/scratch/build/orte/tools/orted'
make[2]: Leaving directory '/scratch/build/orte/tools/orted'
Making install in tools/orterun
make[2]: Entering directory '/scratch/build/orte/tools/orterun'
  CC       main.o
  CC       orterun.o
  GENERATE orterun.1
  CCLD     orterun
make[3]: Entering directory '/scratch/build/orte/tools/orterun'
 /usr/bin/mkdir -p '/opt/openmpi/bin'
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/share/man/man1'
 /usr/bin/install -c -m 644 ../../../../openmpi/orte/tools/orterun/help-orterun.txt '/opt/openmpi/share/openmpi'
  /bin/bash ../../../libtool   --mode=install /usr/bin/install -c orterun '/opt/openmpi/bin'
 /usr/bin/install -c -m 644 orterun.1 '/opt/openmpi/share/man/man1'
libtool: install: /usr/bin/install -c .libs/orterun /opt/openmpi/bin/orterun
make[3]: Leaving directory '/scratch/build/orte/tools/orterun'
make[2]: Leaving directory '/scratch/build/orte/tools/orterun'
Making install in tools/wrappers
make[2]: Entering directory '/scratch/build/orte/tools/wrappers'
make[3]: Entering directory '/scratch/build/orte/tools/wrappers'
make  install-exec-hook
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/pkgconfig'
 /usr/bin/install -c -m 644 ortecc-wrapper-data.txt '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 orte.pc '/opt/openmpi/lib/pkgconfig'
make[4]: Entering directory '/scratch/build/orte/tools/wrappers'
test -z "/opt/openmpi/bin" || /usr/bin/mkdir -p "/opt/openmpi/bin"
(cd /opt/openmpi/bin; rm -f ortecc; ln -s opal_wrapper ortecc)
make[4]: Leaving directory '/scratch/build/orte/tools/wrappers'
make[3]: Leaving directory '/scratch/build/orte/tools/wrappers'
make[2]: Leaving directory '/scratch/build/orte/tools/wrappers'
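The wrappers step above installs ortecc as a symlink to opal_wrapper, with ortecc-wrapper-data.txt supplying the flags the wrapper adds, and drops orte.pc into the pkg-config directory. A short usage sketch (hello.c is a placeholder; as with the other Open MPI wrappers, --showme prints the underlying command line instead of running it):

# inspect what the wrapper would invoke, then build through it
/opt/openmpi/bin/ortecc --showme hello.c -o hello
/opt/openmpi/bin/ortecc hello.c -o hello

# the same flags are also available through pkg-config via the installed orte.pc
PKG_CONFIG_PATH=/opt/openmpi/lib/pkgconfig pkg-config --cflags --libs orte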
Making install in tools/orte-info
make[2]: Entering directory '/scratch/build/orte/tools/orte-info'
  CC       orte-info.o
  CC       output.o
  CC       param.o
  CC       components.o
  CC       version.o
  GENERATE orte-info.1
  CCLD     orte-info
make[3]: Entering directory '/scratch/build/orte/tools/orte-info'
 /usr/bin/mkdir -p '/opt/openmpi/bin'
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/share/man/man1'
  /bin/bash ../../../libtool   --mode=install /usr/bin/install -c orte-info '/opt/openmpi/bin'
 /usr/bin/install -c -m 644 ../../../../openmpi/orte/tools/orte-info/help-orte-info.txt '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 orte-info.1 '/opt/openmpi/share/man/man1'
libtool: install: /usr/bin/install -c .libs/orte-info /opt/openmpi/bin/orte-info
make[3]: Leaving directory '/scratch/build/orte/tools/orte-info'
make[2]: Leaving directory '/scratch/build/orte/tools/orte-info'
Making install in tools/orte-server
make[2]: Entering directory '/scratch/build/orte/tools/orte-server'
  CC       orte-server.o
  GENERATE orte-server.1
  CCLD     orte-server
make[3]: Entering directory '/scratch/build/orte/tools/orte-server'
 /usr/bin/mkdir -p '/opt/openmpi/bin'
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/share/man/man1'
  /bin/bash ../../../libtool   --mode=install /usr/bin/install -c orte-server '/opt/openmpi/bin'
 /usr/bin/install -c -m 644 ../../../../openmpi/orte/tools/orte-server/help-orte-server.txt '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 orte-server.1 '/opt/openmpi/share/man/man1'
libtool: install: /usr/bin/install -c .libs/orte-server /opt/openmpi/bin/orte-server
make[3]: Leaving directory '/scratch/build/orte/tools/orte-server'
make[2]: Leaving directory '/scratch/build/orte/tools/orte-server'
make[1]: Leaving directory '/scratch/build/orte'
Making install in ompi
make[1]: Entering directory '/scratch/build/ompi'
Making install in include
make[2]: Entering directory '/scratch/build/ompi/include'
  GENERATE mpif-sizeof.h
  GENERATE mpif-c-constants-decl.h
  LN_S     mpi_portable_platform.h
make[3]: Entering directory '/scratch/build/ompi/include'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/include'
 /usr/bin/install -c -m 644 mpi.h mpi-ext.h mpif.h mpif-ext.h mpif-sizeof.h mpif-c-constants-decl.h mpi_portable_platform.h '/opt/openmpi/include'
make[3]: Leaving directory '/scratch/build/ompi/include'
make[2]: Leaving directory '/scratch/build/ompi/include'
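With mpi.h and its companion headers now under /opt/openmpi/include, a minimal MPI smoke test can be built once the matching libraries are installed later in this job. A sketch, assuming libmpi.so ends up in /opt/openmpi/lib; hello.c is a placeholder, not part of the build:

cat > hello.c <<'EOF'
#include <mpi.h>
#include <stdio.h>
int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);                 /* start the MPI runtime  */
    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* which process am I?    */
    printf("hello from rank %d\n", rank);
    MPI_Finalize();                         /* shut the runtime down  */
    return 0;
}
EOF
gcc hello.c -I/opt/openmpi/include -L/opt/openmpi/lib -lmpi \
    -Wl,-rpath -Wl,/opt/openmpi/lib -o hello
/opt/openmpi/bin/orterun -np 2 ./hello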
Making install in datatype
make[2]: Entering directory '/scratch/build/ompi/datatype'
  CC       ompi_datatype_args.lo
  CC       ompi_datatype_create.lo
  CC       ompi_datatype_create_indexed.lo
  CC       ompi_datatype_create_contiguous.lo
  CC       ompi_datatype_create_struct.lo
  CC       ompi_datatype_create_vector.lo
  CC       ompi_datatype_create_darray.lo
  CC       ompi_datatype_create_subarray.lo
  CC       ompi_datatype_external.lo
  CC       ompi_datatype_external32.lo
  CC       ompi_datatype_match_size.lo
  CC       ompi_datatype_module.lo
  CC       ompi_datatype_sndrcv.lo
  CC       ompi_datatype_get_elements.lo
  CCLD     libdatatype.la
ar: `u' modifier ignored since `D' is the default (see `U')
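This ar warning, which recurs for each convenience archive below, is harmless: the build invokes ar with the traditional 'cru' flags, but this ar defaults to deterministic 'D' mode, which zeroes member timestamps, so the update-only-if-newer 'u' modifier has nothing to compare against and is ignored. An equivalent explicit invocation, shown only as an illustration:

ar crD .libs/libdatatype.a ompi_datatype_args.o   # deterministic archive; 'u' would be a no-op here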
make[3]: Entering directory '/scratch/build/ompi/datatype'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/ompi/datatype'
make[2]: Leaving directory '/scratch/build/ompi/datatype'
Making install in debuggers
make[2]: Entering directory '/scratch/build/ompi/debuggers'
  CC       libdebuggers_la-ompi_debuggers.lo
  CC       ompi_debugger_canary.lo
  CC       libompi_dbg_msgq_la-ompi_msgq_dll.lo
  CC       libompi_dbg_msgq_la-ompi_common_dll.lo
  CCLD     libompi_dbg_msgq.la
  CCLD     libompi_debugger_canary.la
  CCLD     libdebuggers.la
ar: `u' modifier ignored since `D' is the default (see `U')
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/ompi/debuggers'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../libtool   --mode=install /usr/bin/install -c   libompi_dbg_msgq.la '/opt/openmpi/lib/openmpi'
libtool: install: /usr/bin/install -c .libs/libompi_dbg_msgq.so /opt/openmpi/lib/openmpi/libompi_dbg_msgq.so
libtool: install: /usr/bin/install -c .libs/libompi_dbg_msgq.lai /opt/openmpi/lib/openmpi/libompi_dbg_msgq.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/debuggers'
make[2]: Leaving directory '/scratch/build/ompi/debuggers'
Making install in etc
make[2]: Entering directory '/scratch/build/ompi/etc'
make[3]: Entering directory '/scratch/build/ompi/etc'
make[3]: Nothing to be done for 'install-exec-am'.
/usr/bin/mkdir -p /opt/openmpi/etc
 /usr/bin/install -c -m 644 ../../../openmpi/ompi/etc/openmpi-totalview.tcl /opt/openmpi/etc/openmpi-totalview.tcl
make[3]: Leaving directory '/scratch/build/ompi/etc'
make[2]: Leaving directory '/scratch/build/ompi/etc'
Making install in mpi/c
make[2]: Entering directory '/scratch/build/ompi/mpi/c'
Making install in profile
make[3]: Entering directory '/scratch/build/ompi/mpi/c/profile'
  LN_S     pabort.c
  LN_S     padd_error_class.c
  LN_S     padd_error_code.c
  LN_S     padd_error_string.c
  LN_S     piallgather.c
  LN_S     pallgather.c
  LN_S     piallgatherv.c
  LN_S     pallgatherv.c
  LN_S     palloc_mem.c
  LN_S     pallreduce.c
  LN_S     piallreduce.c
  LN_S     palltoall.c
  LN_S     pialltoall.c
  LN_S     palltoallv.c
  LN_S     pialltoallv.c
  LN_S     palltoallw.c
  LN_S     pialltoallw.c
  LN_S     pattr_delete.c
  LN_S     pattr_get.c
  LN_S     pattr_put.c
  LN_S     pbarrier.c
  LN_S     pibarrier.c
  LN_S     pbcast.c
  LN_S     pibcast.c
  LN_S     pbsend_init.c
  LN_S     pbsend.c
  LN_S     pbuffer_attach.c
  LN_S     pbuffer_detach.c
  LN_S     pcancel.c
  LN_S     pcart_coords.c
  LN_S     pcart_create.c
  LN_S     pcartdim_get.c
  LN_S     pcart_map.c
  LN_S     pcart_get.c
  LN_S     pcart_rank.c
  LN_S     pcart_shift.c
  LN_S     pcart_sub.c
  LN_S     pcomm_accept.c
  LN_S     pclose_port.c
  LN_S     pcomm_c2f.c
  LN_S     pcomm_call_errhandler.c
  LN_S     pcomm_connect.c
  LN_S     pcomm_compare.c
  LN_S     pcomm_create.c
  LN_S     pcomm_create_errhandler.c
  LN_S     pcomm_create_group.c
  LN_S     pcomm_create_keyval.c
  LN_S     pcomm_delete_attr.c
  LN_S     pcomm_disconnect.c
  LN_S     pcomm_dup.c
  LN_S     pcomm_dup_with_info.c
  LN_S     pcomm_idup.c
  LN_S     pcomm_f2c.c
  LN_S     pcomm_free.c
  LN_S     pcomm_free_keyval.c
  LN_S     pcomm_get_attr.c
  LN_S     pcomm_get_errhandler.c
  LN_S     pcomm_get_info.c
  LN_S     pcomm_get_name.c
  LN_S     pcomm_get_parent.c
  LN_S     pcomm_group.c
  LN_S     pcomm_join.c
  LN_S     pcomm_rank.c
  LN_S     pcomm_remote_group.c
  LN_S     pcomm_remote_size.c
  LN_S     pcomm_set_attr.c
  LN_S     pcomm_set_info.c
  LN_S     pdist_graph_create_adjacent.c
  LN_S     pdist_graph_create.c
  LN_S     pdist_graph_neighbors.c
  LN_S     pdist_graph_neighbors_count.c
  LN_S     pcomm_set_name.c
  LN_S     pcomm_set_errhandler.c
  LN_S     pcomm_spawn.c
  LN_S     pcomm_size.c
  LN_S     pcomm_spawn_multiple.c
  LN_S     pcomm_split.c
  LN_S     pcomm_split_type.c
  LN_S     pcompare_and_swap.c
  LN_S     pcomm_test_inter.c
  LN_S     pdims_create.c
  LN_S     perrhandler_c2f.c
  LN_S     perrhandler_free.c
  LN_S     perrhandler_f2c.c
  LN_S     perror_class.c
  LN_S     perror_string.c
  LN_S     pexscan.c
  LN_S     pfetch_and_op.c
  LN_S     piexscan.c
  LN_S     pfile_c2f.c
  LN_S     pfile_call_errhandler.c
  LN_S     pfile_create_errhandler.c
  LN_S     pfile_close.c
  LN_S     pfile_delete.c
  LN_S     pfile_f2c.c
  LN_S     pfile_get_amode.c
  LN_S     pfile_get_atomicity.c
  LN_S     pfile_get_byte_offset.c
  LN_S     pfile_get_errhandler.c
  LN_S     pfile_get_group.c
  LN_S     pfile_get_info.c
  LN_S     pfile_get_position.c
  LN_S     pfile_get_size.c
  LN_S     pfile_get_position_shared.c
  LN_S     pfile_get_view.c
  LN_S     pfile_get_type_extent.c
  LN_S     pfile_iread_at.c
  LN_S     pfile_iread.c
  LN_S     pfile_iread_at_all.c
  LN_S     pfile_iread_all.c
  LN_S     pfile_iwrite_at.c
  LN_S     pfile_iwrite_at_all.c
  LN_S     pfile_iread_shared.c
  LN_S     pfile_iwrite.c
  LN_S     pfile_iwrite_all.c
  LN_S     pfile_open.c
  LN_S     pfile_iwrite_shared.c
  LN_S     pfile_preallocate.c
  LN_S     pfile_read_all_begin.c
  LN_S     pfile_read_all.c
  LN_S     pfile_read_all_end.c
  LN_S     pfile_read_at_all_begin.c
  LN_S     pfile_read_at_all.c
  LN_S     pfile_read_at_all_end.c
  LN_S     pfile_read_at.c
  LN_S     pfile_read.c
  LN_S     pfile_read_ordered.c
  LN_S     pfile_read_ordered_begin.c
  LN_S     pfile_read_ordered_end.c
  LN_S     pfile_read_shared.c
  LN_S     pfile_seek.c
  LN_S     pfile_seek_shared.c
  LN_S     pfile_set_atomicity.c
  LN_S     pfile_set_errhandler.c
  LN_S     pfile_set_info.c
  LN_S     pfile_set_size.c
  LN_S     pfile_set_view.c
  LN_S     pfile_sync.c
  LN_S     pfile_write_all_begin.c
  LN_S     pfile_write_all.c
  LN_S     pfile_write_all_end.c
  LN_S     pfile_write_at_all_begin.c
  LN_S     pfile_write_at_all.c
  LN_S     pfile_write_at_all_end.c
  LN_S     pfile_write_at.c
  LN_S     pfile_write.c
  LN_S     pfile_write_ordered_begin.c
  LN_S     pfile_write_ordered.c
  LN_S     pfile_write_ordered_end.c
  LN_S     pfile_write_shared.c
  LN_S     pfinalize.c
  LN_S     pfinalized.c
  LN_S     pfree_mem.c
  LN_S     pgather.c
  LN_S     pigather.c
  LN_S     pgatherv.c
  LN_S     pigatherv.c
  LN_S     pget_address.c
  LN_S     pget_count.c
  LN_S     pget_elements.c
  LN_S     pget_elements_x.c
  LN_S     pget_accumulate.c
  LN_S     pget_processor_name.c
  LN_S     pget_library_version.c
  LN_S     pget_version.c
  LN_S     pgraph_create.c
  LN_S     pgraph_get.c
  LN_S     pgraph_map.c
  LN_S     pgraph_neighbors_count.c
  LN_S     pgraph_neighbors.c
  LN_S     pgraphdims_get.c
  LN_S     pgrequest_complete.c
  LN_S     pgrequest_start.c
  LN_S     pgroup_c2f.c
  LN_S     pgroup_compare.c
  LN_S     pgroup_difference.c
  LN_S     pgroup_f2c.c
  LN_S     pgroup_excl.c
  LN_S     pgroup_free.c
  LN_S     pgroup_incl.c
  LN_S     pgroup_intersection.c
  LN_S     pgroup_range_excl.c
  LN_S     pgroup_range_incl.c
  LN_S     pgroup_size.c
  LN_S     pgroup_rank.c
  LN_S     pgroup_translate_ranks.c
  LN_S     pgroup_union.c
  LN_S     pibsend.c
  LN_S     pimprobe.c
  LN_S     pinfo_c2f.c
  LN_S     pimrecv.c
  LN_S     pinfo_delete.c
  LN_S     pinfo_create.c
  LN_S     pinfo_dup.c
  LN_S     pinfo_f2c.c
  LN_S     pinfo_free.c
  LN_S     pinfo_get.c
  LN_S     pinfo_get_nkeys.c
  LN_S     pinfo_get_nthkey.c
  LN_S     pinfo_get_valuelen.c
  LN_S     pinit.c
  LN_S     pinfo_set.c
  LN_S     pinit_thread.c
  LN_S     pinitialized.c
  LN_S     pintercomm_create.c
  LN_S     pintercomm_merge.c
  LN_S     piprobe.c
  LN_S     pirecv.c
  LN_S     pirsend.c
  LN_S     pis_thread_main.c
  LN_S     pisend.c
  LN_S     pissend.c
  LN_S     plookup_name.c
  LN_S     pmessage_f2c.c
  LN_S     pmessage_c2f.c
  LN_S     pmprobe.c
  LN_S     pneighbor_allgather.c
  LN_S     pmrecv.c
  LN_S     pineighbor_allgather.c
  LN_S     pneighbor_allgatherv.c
  LN_S     pineighbor_allgatherv.c
  LN_S     pneighbor_alltoall.c
  LN_S     pineighbor_alltoall.c
  LN_S     pneighbor_alltoallv.c
  LN_S     pineighbor_alltoallv.c
  LN_S     pneighbor_alltoallw.c
  LN_S     pineighbor_alltoallw.c
  LN_S     pkeyval_create.c
  LN_S     pkeyval_free.c
  LN_S     pop_c2f.c
  LN_S     pop_create.c
  LN_S     pop_commutative.c
  LN_S     pop_f2c.c
  LN_S     pop_free.c
  LN_S     ppack_external.c
  LN_S     popen_port.c
  LN_S     ppack_external_size.c
  LN_S     ppack.c
  LN_S     ppcontrol.c
  LN_S     ppack_size.c
  LN_S     pprobe.c
  LN_S     ppublish_name.c
  LN_S     pquery_thread.c
  LN_S     praccumulate.c
  LN_S     precv_init.c
  LN_S     precv.c
  LN_S     preduce.c
  LN_S     pregister_datarep.c
  LN_S     pireduce.c
  LN_S     preduce_local.c
  LN_S     preduce_scatter.c
  LN_S     pireduce_scatter.c
  LN_S     preduce_scatter_block.c
  LN_S     pireduce_scatter_block.c
  LN_S     prequest_f2c.c
  LN_S     prequest_c2f.c
  LN_S     prequest_free.c
  LN_S     prequest_get_status.c
  LN_S     prget.c
  LN_S     prget_accumulate.c
  LN_S     prput.c
  LN_S     prsend_init.c
  LN_S     prsend.c
  LN_S     pscan.c
  LN_S     piscan.c
  LN_S     pscatter.c
  LN_S     piscatter.c
  LN_S     pscatterv.c
  LN_S     piscatterv.c
  LN_S     psend.c
  LN_S     psend_init.c
  LN_S     psendrecv_replace.c
  LN_S     psendrecv.c
  LN_S     pssend_init.c
  LN_S     pssend.c
  LN_S     pstart.c
  LN_S     pstartall.c
  LN_S     pstatus_f2c.c
  LN_S     pstatus_c2f.c
  LN_S     pstatus_set_elements.c
  LN_S     pstatus_set_cancelled.c
  LN_S     pstatus_set_elements_x.c
  LN_S     ptestall.c
  LN_S     ptest.c
  LN_S     ptestany.c
  LN_S     ptest_cancelled.c
  LN_S     ptestsome.c
  LN_S     ptopo_test.c
  LN_S     ptype_c2f.c
  LN_S     ptype_commit.c
  LN_S     ptype_contiguous.c
  LN_S     ptype_create_darray.c
  LN_S     ptype_create_f90_complex.c
  LN_S     ptype_create_f90_real.c
  LN_S     ptype_create_f90_integer.c
  LN_S     ptype_create_hindexed.c
  LN_S     ptype_create_hvector.c
  LN_S     ptype_create_indexed_block.c
  LN_S     ptype_create_hindexed_block.c
  LN_S     ptype_create_keyval.c
  LN_S     ptype_create_resized.c
  LN_S     ptype_create_struct.c
  LN_S     ptype_create_subarray.c
  LN_S     ptype_dup.c
  LN_S     ptype_delete_attr.c
  LN_S     ptype_f2c.c
  LN_S     ptype_free.c
  LN_S     ptype_free_keyval.c
  LN_S     ptype_get_attr.c
  LN_S     ptype_get_contents.c
  LN_S     ptype_get_envelope.c
  LN_S     ptype_get_extent.c
  LN_S     ptype_get_extent_x.c
  LN_S     ptype_get_name.c
  LN_S     ptype_get_true_extent.c
  LN_S     ptype_get_true_extent_x.c
  LN_S     ptype_indexed.c
  LN_S     ptype_match_size.c
  LN_S     ptype_set_attr.c
  LN_S     ptype_set_name.c
  LN_S     ptype_size.c
  LN_S     ptype_size_x.c
  LN_S     ptype_vector.c
  LN_S     punpack_external.c
  LN_S     punpack.c
  LN_S     punpublish_name.c
  LN_S     pwait.c
  LN_S     pwaitsome.c
  LN_S     pwaitany.c
  LN_S     pwaitall.c
  LN_S     pwtime.c
  LN_S     paccumulate.c
  LN_S     pget.c
  LN_S     pwtick.c
  LN_S     pput.c
  LN_S     pwin_allocate_shared.c
  LN_S     pwin_allocate.c
  LN_S     pwin_attach.c
  LN_S     pwin_c2f.c
  LN_S     pwin_call_errhandler.c
  LN_S     pwin_complete.c
  LN_S     pwin_create_errhandler.c
  LN_S     pwin_create_keyval.c
  LN_S     pwin_create.c
  LN_S     pwin_create_dynamic.c
  LN_S     pwin_delete_attr.c
  LN_S     pwin_detach.c
  LN_S     pwin_fence.c
  LN_S     pwin_f2c.c
  LN_S     pwin_flush.c
  LN_S     pwin_flush_all.c
  LN_S     pwin_flush_local.c
  LN_S     pwin_flush_local_all.c
  LN_S     pwin_free_keyval.c
  LN_S     pwin_free.c
  LN_S     pwin_get_attr.c
  LN_S     pwin_get_errhandler.c
  LN_S     pwin_get_group.c
  LN_S     pwin_get_info.c
  LN_S     pwin_get_name.c
  LN_S     pwin_lock.c
  LN_S     pwin_lock_all.c
  LN_S     pwin_post.c
  LN_S     pwin_set_attr.c
  LN_S     pwin_set_errhandler.c
  LN_S     pwin_set_info.c
  LN_S     pwin_shared_query.c
  LN_S     pwin_set_name.c
  LN_S     pwin_start.c
  LN_S     pwin_test.c
  LN_S     pwin_unlock.c
  LN_S     pwin_sync.c
  LN_S     pwin_unlock_all.c
  LN_S     pwin_wait.c
  LN_S     perrhandler_create.c
  LN_S     paddress.c
  LN_S     perrhandler_get.c
  LN_S     perrhandler_set.c
  LN_S     ptype_extent.c
  LN_S     ptype_hindexed.c
  LN_S     ptype_hvector.c
  LN_S     ptype_lb.c
  LN_S     ptype_struct.c
  LN_S     ptype_ub.c
  CC       pabort.lo
  CC       padd_error_class.lo
  CC       padd_error_code.lo
  CC       padd_error_string.lo
  CC       pallgather.lo
  CC       piallgather.lo
  CC       pallgatherv.lo
  CC       piallgatherv.lo
  CC       palloc_mem.lo
  CC       pallreduce.lo
  CC       piallreduce.lo
  CC       palltoall.lo
  CC       pialltoall.lo
  CC       palltoallv.lo
  CC       pialltoallv.lo
  CC       palltoallw.lo
  CC       pialltoallw.lo
  CC       pattr_delete.lo
  CC       pattr_get.lo
  CC       pattr_put.lo
  CC       pbarrier.lo
  CC       pibarrier.lo
  CC       pbcast.lo
  CC       pibcast.lo
  CC       pbsend.lo
  CC       pbsend_init.lo
  CC       pbuffer_attach.lo
  CC       pbuffer_detach.lo
  CC       pcancel.lo
  CC       pcart_coords.lo
  CC       pcart_create.lo
  CC       pcartdim_get.lo
  CC       pcart_get.lo
  CC       pcart_map.lo
  CC       pcart_rank.lo
  CC       pcart_shift.lo
  CC       pcart_sub.lo
  CC       pclose_port.lo
  CC       pcomm_accept.lo
  CC       pcomm_c2f.lo
  CC       pcomm_call_errhandler.lo
  CC       pcomm_compare.lo
  CC       pcomm_connect.lo
  CC       pcomm_create.lo
  CC       pcomm_create_errhandler.lo
  CC       pcomm_create_group.lo
  CC       pcomm_create_keyval.lo
  CC       pcomm_delete_attr.lo
  CC       pcomm_disconnect.lo
  CC       pcomm_dup.lo
  CC       pcomm_dup_with_info.lo
  CC       pcomm_idup.lo
  CC       pcomm_f2c.lo
  CC       pcomm_free.lo
  CC       pcomm_free_keyval.lo
  CC       pcomm_get_attr.lo
  CC       pcomm_get_errhandler.lo
  CC       pcomm_get_info.lo
  CC       pcomm_get_name.lo
  CC       pcomm_get_parent.lo
  CC       pcomm_group.lo
  CC       pcomm_join.lo
  CC       pcomm_rank.lo
  CC       pcomm_remote_group.lo
  CC       pcomm_remote_size.lo
  CC       pcomm_set_attr.lo
  CC       pcomm_set_info.lo
  CC       pdist_graph_create.lo
  CC       pdist_graph_create_adjacent.lo
  CC       pdist_graph_neighbors.lo
  CC       pdist_graph_neighbors_count.lo
  CC       pcomm_set_errhandler.lo
  CC       pcomm_set_name.lo
  CC       pcomm_size.lo
  CC       pcomm_spawn.lo
  CC       pcomm_spawn_multiple.lo
  CC       pcomm_split.lo
  CC       pcomm_split_type.lo
  CC       pcomm_test_inter.lo
  CC       pcompare_and_swap.lo
  CC       pdims_create.lo
  CC       perrhandler_c2f.lo
  CC       perrhandler_f2c.lo
  CC       perrhandler_free.lo
  CC       perror_class.lo
  CC       perror_string.lo
  CC       pexscan.lo
  CC       pfetch_and_op.lo
  CC       piexscan.lo
  CC       pfile_c2f.lo
  CC       pfile_call_errhandler.lo
  CC       pfile_close.lo
  CC       pfile_create_errhandler.lo
  CC       pfile_delete.lo
  CC       pfile_f2c.lo
  CC       pfile_get_amode.lo
  CC       pfile_get_atomicity.lo
  CC       pfile_get_byte_offset.lo
  CC       pfile_get_errhandler.lo
  CC       pfile_get_group.lo
  CC       pfile_get_info.lo
  CC       pfile_get_position.lo
  CC       pfile_get_position_shared.lo
  CC       pfile_get_size.lo
  CC       pfile_get_type_extent.lo
  CC       pfile_get_view.lo
  CC       pfile_iread_at.lo
  CC       pfile_iread.lo
  CC       pfile_iread_at_all.lo
  CC       pfile_iread_all.lo
  CC       pfile_iread_shared.lo
  CC       pfile_iwrite_at.lo
  CC       pfile_iwrite.lo
  CC       pfile_iwrite_at_all.lo
  CC       pfile_iwrite_all.lo
  CC       pfile_iwrite_shared.lo
  CC       pfile_open.lo
  CC       pfile_preallocate.lo
  CC       pfile_read_all_begin.lo
  CC       pfile_read_all.lo
  CC       pfile_read_all_end.lo
  CC       pfile_read_at_all_begin.lo
  CC       pfile_read_at_all.lo
  CC       pfile_read_at_all_end.lo
  CC       pfile_read_at.lo
  CC       pfile_read.lo
  CC       pfile_read_ordered_begin.lo
  CC       pfile_read_ordered.lo
  CC       pfile_read_ordered_end.lo
  CC       pfile_read_shared.lo
  CC       pfile_seek.lo
  CC       pfile_seek_shared.lo
  CC       pfile_set_atomicity.lo
  CC       pfile_set_errhandler.lo
  CC       pfile_set_info.lo
  CC       pfile_set_size.lo
  CC       pfile_set_view.lo
  CC       pfile_sync.lo
  CC       pfile_write_all_begin.lo
  CC       pfile_write_all.lo
  CC       pfile_write_all_end.lo
  CC       pfile_write_at_all_begin.lo
  CC       pfile_write_at_all.lo
  CC       pfile_write_at_all_end.lo
  CC       pfile_write_at.lo
  CC       pfile_write.lo
  CC       pfile_write_ordered_begin.lo
  CC       pfile_write_ordered.lo
  CC       pfile_write_ordered_end.lo
  CC       pfile_write_shared.lo
  CC       pfinalize.lo
  CC       pfinalized.lo
  CC       pfree_mem.lo
  CC       pgather.lo
  CC       pigather.lo
  CC       pgatherv.lo
  CC       pigatherv.lo
  CC       pget_address.lo
  CC       pget_count.lo
  CC       pget_elements.lo
  CC       pget_elements_x.lo
  CC       pget_accumulate.lo
  CC       pget_library_version.lo
  CC       pget_processor_name.lo
  CC       pget_version.lo
  CC       pgraph_create.lo
  CC       pgraph_get.lo
  CC       pgraph_map.lo
  CC       pgraph_neighbors_count.lo
  CC       pgraph_neighbors.lo
  CC       pgraphdims_get.lo
  CC       pgrequest_complete.lo
  CC       pgrequest_start.lo
  CC       pgroup_c2f.lo
  CC       pgroup_compare.lo
  CC       pgroup_difference.lo
  CC       pgroup_excl.lo
  CC       pgroup_f2c.lo
  CC       pgroup_free.lo
  CC       pgroup_incl.lo
  CC       pgroup_intersection.lo
  CC       pgroup_range_excl.lo
  CC       pgroup_range_incl.lo
  CC       pgroup_rank.lo
  CC       pgroup_size.lo
  CC       pgroup_translate_ranks.lo
  CC       pgroup_union.lo
  CC       pibsend.lo
  CC       pimprobe.lo
  CC       pimrecv.lo
  CC       pinfo_c2f.lo
  CC       pinfo_create.lo
  CC       pinfo_delete.lo
  CC       pinfo_dup.lo
  CC       pinfo_f2c.lo
  CC       pinfo_free.lo
  CC       pinfo_get.lo
  CC       pinfo_get_nkeys.lo
  CC       pinfo_get_nthkey.lo
  CC       pinfo_get_valuelen.lo
  CC       pinfo_set.lo
  CC       pinit.lo
  CC       pinit_thread.lo
  CC       pinitialized.lo
  CC       pintercomm_create.lo
  CC       pintercomm_merge.lo
  CC       piprobe.lo
  CC       pirecv.lo
  CC       pirsend.lo
  CC       pis_thread_main.lo
  CC       pisend.lo
  CC       pissend.lo
  CC       plookup_name.lo
  CC       pmessage_f2c.lo
  CC       pmessage_c2f.lo
  CC       pmprobe.lo
  CC       pmrecv.lo
  CC       pneighbor_allgather.lo
  CC       pineighbor_allgather.lo
  CC       pneighbor_allgatherv.lo
  CC       pineighbor_allgatherv.lo
  CC       pneighbor_alltoall.lo
  CC       pineighbor_alltoall.lo
  CC       pneighbor_alltoallv.lo
  CC       pineighbor_alltoallv.lo
  CC       pneighbor_alltoallw.lo
  CC       pineighbor_alltoallw.lo
  CC       pkeyval_create.lo
  CC       pkeyval_free.lo
  CC       pop_c2f.lo
  CC       pop_create.lo
  CC       pop_commutative.lo
  CC       pop_f2c.lo
  CC       pop_free.lo
  CC       popen_port.lo
  CC       ppack_external.lo
  CC       ppack_external_size.lo
  CC       ppack.lo
  CC       ppack_size.lo
  CC       ppcontrol.lo
  CC       pprobe.lo
  CC       ppublish_name.lo
  CC       pquery_thread.lo
  CC       praccumulate.lo
  CC       precv_init.lo
  CC       precv.lo
  CC       preduce.lo
  CC       pregister_datarep.lo
  CC       pireduce.lo
  CC       preduce_local.lo
  CC       preduce_scatter.lo
  CC       pireduce_scatter.lo
  CC       preduce_scatter_block.lo
  CC       pireduce_scatter_block.lo
  CC       prequest_c2f.lo
  CC       prequest_f2c.lo
  CC       prequest_free.lo
  CC       prequest_get_status.lo
  CC       prget.lo
  CC       prget_accumulate.lo
  CC       prput.lo
  CC       prsend_init.lo
  CC       prsend.lo
  CC       pscan.lo
  CC       piscan.lo
  CC       pscatter.lo
  CC       piscatter.lo
  CC       pscatterv.lo
  CC       piscatterv.lo
  CC       psend.lo
  CC       psend_init.lo
  CC       psendrecv.lo
  CC       psendrecv_replace.lo
  CC       pssend_init.lo
  CC       pssend.lo
  CC       pstart.lo
  CC       pstartall.lo
  CC       pstatus_c2f.lo
  CC       pstatus_f2c.lo
  CC       pstatus_set_cancelled.lo
  CC       pstatus_set_elements.lo
  CC       pstatus_set_elements_x.lo
  CC       ptestall.lo
  CC       ptestany.lo
  CC       ptest.lo
  CC       ptest_cancelled.lo
  CC       ptestsome.lo
  CC       ptopo_test.lo
  CC       ptype_c2f.lo
  CC       ptype_commit.lo
  CC       ptype_contiguous.lo
  CC       ptype_create_darray.lo
  CC       ptype_create_f90_complex.lo
  CC       ptype_create_f90_integer.lo
  CC       ptype_create_f90_real.lo
  CC       ptype_create_hindexed.lo
  CC       ptype_create_hvector.lo
  CC       ptype_create_indexed_block.lo
  CC       ptype_create_hindexed_block.lo
  CC       ptype_create_keyval.lo
  CC       ptype_create_resized.lo
  CC       ptype_create_struct.lo
  CC       ptype_create_subarray.lo
  CC       ptype_delete_attr.lo
  CC       ptype_dup.lo
  CC       ptype_f2c.lo
  CC       ptype_free.lo
  CC       ptype_free_keyval.lo
  CC       ptype_get_attr.lo
  CC       ptype_get_contents.lo
  CC       ptype_get_envelope.lo
  CC       ptype_get_extent.lo
  CC       ptype_get_extent_x.lo
  CC       ptype_get_name.lo
  CC       ptype_get_true_extent.lo
  CC       ptype_get_true_extent_x.lo
  CC       ptype_indexed.lo
  CC       ptype_match_size.lo
  CC       ptype_set_attr.lo
  CC       ptype_set_name.lo
  CC       ptype_size.lo
  CC       ptype_size_x.lo
  CC       ptype_vector.lo
  CC       punpack_external.lo
  CC       punpack.lo
  CC       punpublish_name.lo
  CC       pwait.lo
  CC       pwaitall.lo
  CC       pwaitany.lo
  CC       pwaitsome.lo
  CC       pwtime.lo
  CC       pwtick.lo
  CC       paccumulate.lo
  CC       pget.lo
  CC       pput.lo
  CC       pwin_allocate.lo
  CC       pwin_allocate_shared.lo
  CC       pwin_attach.lo
  CC       pwin_c2f.lo
  CC       pwin_call_errhandler.lo
  CC       pwin_complete.lo
  CC       pwin_create_keyval.lo
  CC       pwin_create_errhandler.lo
  CC       pwin_create.lo
Get:16 https://apt.repos.intel.com/oneapi all/main amd64 intel-oneapi-compiler-shared-runtime-2023.0.0 amd64 2023.0.0-25370 [144 MB]
  CC       pwin_create_dynamic.lo
  CC       pwin_delete_attr.lo
  CC       pwin_detach.lo
  CC       pwin_f2c.lo
  CC       pwin_fence.lo
  CC       pwin_flush.lo
  CC       pwin_flush_all.lo
  CC       pwin_flush_local.lo
  CC       pwin_flush_local_all.lo
  CC       pwin_free_keyval.lo
  CC       pwin_free.lo
  CC       pwin_get_attr.lo
  CC       pwin_get_errhandler.lo
  CC       pwin_get_group.lo
  CC       pwin_get_info.lo
  CC       pwin_get_name.lo
  CC       pwin_lock.lo
  CC       pwin_lock_all.lo
  CC       pwin_post.lo
  CC       pwin_set_attr.lo
  CC       pwin_set_errhandler.lo
  CC       pwin_set_info.lo
  CC       pwin_set_name.lo
  CC       pwin_shared_query.lo
  CC       pwin_start.lo
  CC       pwin_sync.lo
  CC       pwin_test.lo
  CC       pwin_unlock.lo
  CC       pwin_unlock_all.lo
  CC       pwin_wait.lo
  CC       paddress.lo
  CC       perrhandler_create.lo
  CC       perrhandler_get.lo
  CC       perrhandler_set.lo
  CC       ptype_extent.lo
  CC       ptype_hindexed.lo
  CC       ptype_hvector.lo
  CC       ptype_lb.lo
  CC       ptype_struct.lo
  CC       ptype_ub.lo
  CCLD     libmpi_c_pmpi.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[4]: Entering directory '/scratch/build/ompi/mpi/c/profile'
make[4]: Nothing to be done for 'install-exec-am'.
make[4]: Nothing to be done for 'install-data-am'.
make[4]: Leaving directory '/scratch/build/ompi/mpi/c/profile'
make[3]: Leaving directory '/scratch/build/ompi/mpi/c/profile'
make[3]: Entering directory '/scratch/build/ompi/mpi/c'
  CC       attr_fn.lo
  CCLD     libmpi_c.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[4]: Entering directory '/scratch/build/ompi/mpi/c'
make[4]: Nothing to be done for 'install-exec-am'.
make[4]: Leaving directory '/scratch/build/ompi/mpi/c'
make[3]: Leaving directory '/scratch/build/ompi/mpi/c'
make[2]: Leaving directory '/scratch/build/ompi/mpi/c'
Making install in mpi/tool
make[2]: Entering directory '/scratch/build/ompi/mpi/tool'
Making install in profile
make[3]: Entering directory '/scratch/build/ompi/mpi/tool/profile'
  LN_S     pcategory_changed.c
  LN_S     pcategory_get_categories.c
  LN_S     pcategory_get_cvars.c
  LN_S     pcategory_get_info.c
  LN_S     pcategory_get_index.c
  LN_S     pcategory_get_num.c
  LN_S     pcvar_get_info.c
  LN_S     pcategory_get_pvars.c
  LN_S     pcvar_get_num.c
  LN_S     pcvar_get_index.c
  LN_S     pcvar_handle_alloc.c
  LN_S     pcvar_handle_free.c
  LN_S     pcvar_read.c
  LN_S     pcvar_write.c
  LN_S     penum_get_info.c
  LN_S     penum_get_item.c
  LN_S     pfinalize.c
  LN_S     pinit_thread.c
  LN_S     ppvar_get_info.c
  LN_S     ppvar_get_index.c
  LN_S     ppvar_get_num.c
  LN_S     ppvar_handle_alloc.c
  LN_S     ppvar_handle_free.c
  LN_S     ppvar_read.c
  LN_S     ppvar_readreset.c
  LN_S     ppvar_reset.c
  LN_S     ppvar_session_create.c
  LN_S     ppvar_session_free.c
  LN_S     ppvar_start.c
  LN_S     ppvar_stop.c
  LN_S     ppvar_write.c
  CC       pcategory_changed.lo
  CC       pcategory_get_categories.lo
  CC       pcategory_get_cvars.lo
  CC       pcategory_get_info.lo
  CC       pcategory_get_index.lo
  CC       pcategory_get_pvars.lo
  CC       pcategory_get_num.lo
  CC       pcvar_get_info.lo
  CC       pcvar_get_index.lo
  CC       pcvar_get_num.lo
  CC       pcvar_handle_alloc.lo
  CC       pcvar_handle_free.lo
  CC       pcvar_read.lo
  CC       pcvar_write.lo
  CC       penum_get_info.lo
  CC       penum_get_item.lo
  CC       pfinalize.lo
  CC       pinit_thread.lo
  CC       ppvar_get_info.lo
  CC       ppvar_get_index.lo
Get:17 https://apt.repos.intel.com/oneapi all/main all intel-oneapi-tbb-common-2021.8.0 all 2021.8.0-25334 [21.2 kB]
Get:18 https://apt.repos.intel.com/oneapi all/main amd64 intel-oneapi-tbb-2021.8.0 amd64 2021.8.0-25334 [2162 kB]
Get:19 https://apt.repos.intel.com/oneapi all/main amd64 intel-oneapi-compiler-dpcpp-cpp-runtime-2023.0.0 amd64 2023.0.0-25370 [31.6 MB]
  CC       ppvar_get_num.lo
  CC       ppvar_handle_alloc.lo
  CC       ppvar_handle_free.lo
  CC       ppvar_read.lo
  CC       ppvar_readreset.lo
  CC       ppvar_reset.lo
  CC       ppvar_session_create.lo
  CC       ppvar_session_free.lo
  CC       ppvar_start.lo
  CC       ppvar_stop.lo
  CC       ppvar_write.lo
  CCLD     libmpi_pmpit.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[4]: Entering directory '/scratch/build/ompi/mpi/tool/profile'
make[4]: Nothing to be done for 'install-exec-am'.
make[4]: Leaving directory '/scratch/build/ompi/mpi/tool/profile'
make[3]: Leaving directory '/scratch/build/ompi/mpi/tool/profile'
make[3]: Entering directory '/scratch/build/ompi/mpi/tool'
  CC       mpit_common.lo
  CCLD     libmpi_mpit_common.la
ar: `u' modifier ignored since `D' is the default (see `U')
Get:20 https://apt.repos.intel.com/oneapi all/main all intel-oneapi-compiler-shared-common-2023.0.0 all 2023.0.0-25370 [100 MB]
make[4]: Entering directory '/scratch/build/ompi/mpi/tool'
make[4]: Nothing to be done for 'install-exec-am'.
make[4]: Leaving directory '/scratch/build/ompi/mpi/tool'
make[3]: Leaving directory '/scratch/build/ompi/mpi/tool'
make[2]: Leaving directory '/scratch/build/ompi/mpi/tool'
Making install in mpiext/affinity/c
make[2]: Entering directory '/scratch/build/ompi/mpiext/affinity/c'
  CC       mpiext_affinity_str.lo
  GENERATE OMPI_Affinity_str.3
  CCLD     libmpiext_affinity_c.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/ompi/mpiext/affinity/c'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/man/man3'
 /usr/bin/mkdir -p '/opt/openmpi/include/openmpi/mpiext/'
 /usr/bin/install -c -m 644 ../../../../../openmpi/ompi/mpiext/affinity/c/mpiext_affinity_c.h '/opt/openmpi/include/openmpi/mpiext/'
 /usr/bin/install -c -m 644 OMPI_Affinity_str.3 '/opt/openmpi/share/man/man3'
make[3]: Leaving directory '/scratch/build/ompi/mpiext/affinity/c'
make[2]: Leaving directory '/scratch/build/ompi/mpiext/affinity/c'
Making install in mpiext/cuda/c
make[2]: Entering directory '/scratch/build/ompi/mpiext/cuda/c'
  CC       mpiext_cuda.lo
  GENERATE MPIX_Query_cuda_support.3
  CCLD     libmpiext_cuda_c.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/ompi/mpiext/cuda/c'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/man/man3'
 /usr/bin/mkdir -p '/opt/openmpi/include/openmpi/mpiext'
 /usr/bin/install -c -m 644 mpiext_cuda_c.h '/opt/openmpi/include/openmpi/mpiext'
 /usr/bin/install -c -m 644 MPIX_Query_cuda_support.3 '/opt/openmpi/share/man/man3'
make[3]: Leaving directory '/scratch/build/ompi/mpiext/cuda/c'
make[2]: Leaving directory '/scratch/build/ompi/mpiext/cuda/c'
Making install in mpiext/pcollreq/c
make[2]: Entering directory '/scratch/build/ompi/mpiext/pcollreq/c'
Making install in profile
make[3]: Entering directory '/scratch/build/ompi/mpiext/pcollreq/c/profile'
  LN_S     pallgather_init.c
  LN_S     pallreduce_init.c
  LN_S     palltoall_init.c
  LN_S     pallgatherv_init.c
  LN_S     palltoallv_init.c
  LN_S     palltoallw_init.c
  LN_S     pbarrier_init.c
  LN_S     pbcast_init.c
  LN_S     pexscan_init.c
  LN_S     pgather_init.c
  LN_S     pgatherv_init.c
  LN_S     preduce_init.c
  LN_S     preduce_scatter_block_init.c
  LN_S     preduce_scatter_init.c
  LN_S     pscan_init.c
  LN_S     pscatterv_init.c
  LN_S     pscatter_init.c
  LN_S     pneighbor_allgather_init.c
  LN_S     pneighbor_allgatherv_init.c
  LN_S     pneighbor_alltoall_init.c
  LN_S     pneighbor_alltoallv_init.c
  LN_S     pneighbor_alltoallw_init.c
  CC       pallgather_init.lo
  CC       pallgatherv_init.lo
  CC       pallreduce_init.lo
  CC       palltoall_init.lo
  CC       palltoallv_init.lo
  CC       palltoallw_init.lo
  CC       pbarrier_init.lo
  CC       pbcast_init.lo
  CC       pexscan_init.lo
  CC       pgather_init.lo
  CC       pgatherv_init.lo
  CC       preduce_init.lo
  CC       preduce_scatter_block_init.lo
  CC       preduce_scatter_init.lo
  CC       pscan_init.lo
  CC       pscatter_init.lo
  CC       pscatterv_init.lo
  CC       pneighbor_allgather_init.lo
  CC       pneighbor_allgatherv_init.lo
  CC       pneighbor_alltoall_init.lo
  CC       pneighbor_alltoallv_init.lo
  CC       pneighbor_alltoallw_init.lo
  CCLD     libpmpiext_pcollreq_c.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[4]: Entering directory '/scratch/build/ompi/mpiext/pcollreq/c/profile'
make[4]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/include/openmpi/mpiext'
 /usr/bin/install -c -m 644 ../../../../../../openmpi/ompi/mpiext/pcollreq/c/profile/pmpiext_pcollreq_c.h '/opt/openmpi/include/openmpi/mpiext'
make[4]: Leaving directory '/scratch/build/ompi/mpiext/pcollreq/c/profile'
make[3]: Leaving directory '/scratch/build/ompi/mpiext/pcollreq/c/profile'
make[3]: Entering directory '/scratch/build/ompi/mpiext/pcollreq/c'
  CC       mpiext_pcollreq_c.lo
  GENERATE MPIX_Allgatherv_init.3
  GENERATE MPIX_Allgather_init.3
  GENERATE MPIX_Allreduce_init.3
  GENERATE MPIX_Alltoall_init.3
  GENERATE MPIX_Alltoallw_init.3
  GENERATE MPIX_Alltoallv_init.3
  GENERATE MPIX_Barrier_init.3
  GENERATE MPIX_Bcast_init.3
  GENERATE MPIX_Exscan_init.3
  GENERATE MPIX_Gatherv_init.3
  GENERATE MPIX_Reduce_init.3
  GENERATE MPIX_Gather_init.3
  GENERATE MPIX_Reduce_scatter_block_init.3
  GENERATE MPIX_Reduce_scatter_init.3
  GENERATE MPIX_Scan_init.3
  GENERATE MPIX_Scatter_init.3
  GENERATE MPIX_Scatterv_init.3
  GENERATE MPIX_Neighbor_allgather_init.3
  GENERATE MPIX_Neighbor_allgatherv_init.3
  GENERATE MPIX_Neighbor_alltoall_init.3
  GENERATE MPIX_Neighbor_alltoallv_init.3
  GENERATE MPIX_Neighbor_alltoallw_init.3
  CCLD     libmpiext_pcollreq_c.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[4]: Entering directory '/scratch/build/ompi/mpiext/pcollreq/c'
make[4]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/include/openmpi/mpiext'
 /usr/bin/mkdir -p '/opt/openmpi/share/man/man3'
 /usr/bin/install -c -m 644 ../../../../../openmpi/ompi/mpiext/pcollreq/c/mpiext_pcollreq_c.h '/opt/openmpi/include/openmpi/mpiext'
 /usr/bin/install -c -m 644 MPIX_Allgather_init.3 MPIX_Allgatherv_init.3 MPIX_Allreduce_init.3 MPIX_Alltoall_init.3 MPIX_Alltoallv_init.3 MPIX_Alltoallw_init.3 MPIX_Barrier_init.3 MPIX_Bcast_init.3 MPIX_Exscan_init.3 MPIX_Gather_init.3 MPIX_Gatherv_init.3 MPIX_Reduce_init.3 MPIX_Reduce_scatter_block_init.3 MPIX_Reduce_scatter_init.3 MPIX_Scan_init.3 MPIX_Scatter_init.3 MPIX_Scatterv_init.3 MPIX_Neighbor_allgather_init.3 MPIX_Neighbor_allgatherv_init.3 MPIX_Neighbor_alltoall_init.3 MPIX_Neighbor_alltoallv_init.3 MPIX_Neighbor_alltoallw_init.3 '/opt/openmpi/share/man/man3'
make[4]: Leaving directory '/scratch/build/ompi/mpiext/pcollreq/c'
make[3]: Leaving directory '/scratch/build/ompi/mpiext/pcollreq/c'
make[2]: Leaving directory '/scratch/build/ompi/mpiext/pcollreq/c'
Making install in mpi/fortran/base/
make[2]: Entering directory '/scratch/build/ompi/mpi/fortran/base'
make[3]: Entering directory '/scratch/build/ompi/mpi/fortran/base'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/scratch/build/ompi/mpi/fortran/base'
make[2]: Leaving directory '/scratch/build/ompi/mpi/fortran/base'
Making install in mca/common
make[2]: Entering directory '/scratch/build/ompi/mca/common'
make[3]: Entering directory '/scratch/build/ompi/mca/common'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/scratch/build/ompi/mca/common'
make[2]: Leaving directory '/scratch/build/ompi/mca/common'
Making install in mca/bml
make[2]: Entering directory '/scratch/build/ompi/mca/bml'
  CC       base/bml_base_btl.lo
  CC       base/bml_base_endpoint.lo
  CC       base/bml_base_init.lo
  CC       base/bml_base_frame.lo
  CC       base/bml_base_ft.lo
  CCLD     libmca_bml.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/ompi/mca/bml'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/ompi/mca/bml'
make[2]: Leaving directory '/scratch/build/ompi/mca/bml'
Making install in mca/coll
make[2]: Entering directory '/scratch/build/ompi/mca/coll'
  CC       base/coll_base_comm_select.lo
  CC       base/coll_base_comm_unselect.lo
  CC       base/coll_base_frame.lo
  CC       base/coll_base_find_available.lo
  CC       base/coll_base_bcast.lo
  CC       base/coll_base_scatter.lo
  CC       base/coll_base_topo.lo
  CC       base/coll_base_allgather.lo
  CC       base/coll_base_allgatherv.lo
  CC       base/coll_base_util.lo
  CC       base/coll_base_allreduce.lo
  CC       base/coll_base_alltoall.lo
  CC       base/coll_base_gather.lo
  CC       base/coll_base_alltoallv.lo
  CC       base/coll_base_reduce.lo
  CC       base/coll_base_barrier.lo
  CC       base/coll_base_reduce_scatter.lo
  CC       base/coll_base_reduce_scatter_block.lo
  CC       base/coll_base_exscan.lo
  CC       base/coll_base_scan.lo
  CCLD     libmca_coll.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/ompi/mca/coll'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../openmpi/ompi/mca/coll/base/help-mca-coll-base.txt '/opt/openmpi/share/openmpi'
make[3]: Leaving directory '/scratch/build/ompi/mca/coll'
make[2]: Leaving directory '/scratch/build/ompi/mca/coll'
Making install in mca/crcp
make[2]: Entering directory '/scratch/build/ompi/mca/crcp'
  GENERATE ompi_crcp.7
  CC       base/crcp_base_frame.lo
  CC       base/crcp_base_select.lo
  CC       base/crcp_base_fns.lo
  CCLD     libmca_crcp.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/ompi/mca/crcp'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/man/man7'
 /usr/bin/install -c -m 644 ompi_crcp.7 '/opt/openmpi/share/man/man7'
make[3]: Leaving directory '/scratch/build/ompi/mca/crcp'
make[2]: Leaving directory '/scratch/build/ompi/mca/crcp'
Making install in mca/fbtl
make[2]: Entering directory '/scratch/build/ompi/mca/fbtl'
  CC       base/fbtl_base_file_select.lo
  CC       base/fbtl_base_frame.lo
  CC       base/fbtl_base_file_unselect.lo
  CC       base/fbtl_base_find_available.lo
Get:21 https://apt.repos.intel.com/oneapi all/main all intel-oneapi-dpcpp-debugger-eclipse-cfg all 2023.1.0-43513 [2004 B]
Get:22 https://apt.repos.intel.com/oneapi all/main amd64 intel-oneapi-dpcpp-debugger-2023.0.0 amd64 2023.0.0-25336 [196 MB]
  CCLD     libmca_fbtl.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/ompi/mca/fbtl'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/ompi/mca/fbtl'
make[2]: Leaving directory '/scratch/build/ompi/mca/fbtl'
Making install in mca/fcoll
make[2]: Entering directory '/scratch/build/ompi/mca/fcoll'
  CC       base/fcoll_base_frame.lo
  CC       base/fcoll_base_file_select.lo
  CC       base/fcoll_base_file_unselect.lo
  CC       base/fcoll_base_find_available.lo
  CC       base/fcoll_base_sort.lo
  CC       base/fcoll_base_coll_array.lo
  CCLD     libmca_fcoll.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/ompi/mca/fcoll'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/ompi/mca/fcoll'
make[2]: Leaving directory '/scratch/build/ompi/mca/fcoll'
Making install in mca/fs
make[2]: Entering directory '/scratch/build/ompi/mca/fs'
  CC       base/fs_base_frame.lo
  CC       base/fs_base_file_select.lo
  CC       base/fs_base_find_available.lo
  CC       base/fs_base_file_unselect.lo
  CC       base/fs_base_get_parent_dir.lo
  CC       base/fs_base_file_close.lo
  CC       base/fs_base_file_sync.lo
  CC       base/fs_base_file_delete.lo
  CC       base/fs_base_file_set_size.lo
  CC       base/fs_base_file_get_size.lo
  CCLD     libmca_fs.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/ompi/mca/fs'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/ompi/mca/fs'
make[2]: Leaving directory '/scratch/build/ompi/mca/fs'
Making install in mca/hook
make[2]: Entering directory '/scratch/build/ompi/mca/hook'
  CC       base/hook_base.lo
  CCLD     libmca_hook.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/ompi/mca/hook'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../openmpi/ompi/mca/hook/base/help-mca-hook-base.txt '/opt/openmpi/share/openmpi'
make[3]: Leaving directory '/scratch/build/ompi/mca/hook'
make[2]: Leaving directory '/scratch/build/ompi/mca/hook'
Making install in mca/io
make[2]: Entering directory '/scratch/build/ompi/mca/io'
  CC       base/io_base_frame.lo
  CC       base/io_base_delete.lo
  CC       base/io_base_find_available.lo
  CC       base/io_base_file_select.lo
  CC       base/io_base_request.lo
  CC       base/io_base_register_datarep.lo
  CCLD     libmca_io.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/ompi/mca/io'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/ompi/mca/io'
make[2]: Leaving directory '/scratch/build/ompi/mca/io'
Making install in mca/mtl
make[2]: Entering directory '/scratch/build/ompi/mca/mtl'
  CC       base/mtl_base_frame.lo
  CCLD     libmca_mtl.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/ompi/mca/mtl'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/ompi/mca/mtl'
make[2]: Leaving directory '/scratch/build/ompi/mca/mtl'
Making install in mca/op
make[2]: Entering directory '/scratch/build/ompi/mca/op'
  CC       base/op_base_frame.lo
  CC       base/op_base_find_available.lo
  CC       base/op_base_functions.lo
  CC       base/op_base_op_select.lo
  CCLD     libmca_op.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/ompi/mca/op'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/ompi/mca/op'
make[2]: Leaving directory '/scratch/build/ompi/mca/op'
Making install in mca/osc
make[2]: Entering directory '/scratch/build/ompi/mca/osc'
  CC       base/osc_base_frame.lo
  CC       base/osc_base_init.lo
  CC       base/osc_base_obj_convert.lo
  CCLD     libmca_osc.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/ompi/mca/osc'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/ompi/mca/osc'
make[2]: Leaving directory '/scratch/build/ompi/mca/osc'
Making install in mca/pml
make[2]: Entering directory '/scratch/build/ompi/mca/pml'
  CC       base/pml_base_bsend.lo
  CC       base/pml_base_frame.lo
  CC       base/pml_base_recvreq.lo
  CC       base/pml_base_request.lo
  CC       base/pml_base_select.lo
  CC       base/pml_base_sendreq.lo
  CC       base/pml_base_ft.lo
  CCLD     libmca_pml.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/ompi/mca/pml'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/ompi/mca/pml'
make[2]: Leaving directory '/scratch/build/ompi/mca/pml'
Making install in mca/rte
make[2]: Entering directory '/scratch/build/ompi/mca/rte'
  CC       base/rte_base_frame.lo
  CCLD     libmca_rte.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/ompi/mca/rte'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/ompi/mca/rte'
make[2]: Leaving directory '/scratch/build/ompi/mca/rte'
Making install in mca/sharedfp
make[2]: Entering directory '/scratch/build/ompi/mca/sharedfp'
  CC       base/sharedfp_base_file_select.lo
  CC       base/sharedfp_base_file_unselect.lo
  CC       base/sharedfp_base_frame.lo
  CC       base/sharedfp_base_find_available.lo
  CCLD     libmca_sharedfp.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/ompi/mca/sharedfp'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/ompi/mca/sharedfp'
make[2]: Leaving directory '/scratch/build/ompi/mca/sharedfp'
Making install in mca/topo
make[2]: Entering directory '/scratch/build/ompi/mca/topo'
  CC       base/topo_base_cart_coords.lo
  CC       base/topo_base_cart_create.lo
  CC       base/topo_base_cart_get.lo
  CC       base/topo_base_cart_rank.lo
  CC       base/topo_base_cart_map.lo
  CC       base/topo_base_cart_shift.lo
  CC       base/topo_base_cart_sub.lo
  CC       base/topo_base_cartdim_get.lo
  CC       base/topo_base_comm_select.lo
  CC       base/topo_base_dist_graph_create.lo
  CC       base/topo_base_dist_graph_create_adjacent.lo
  CC       base/topo_base_dist_graph_neighbors.lo
  CC       base/topo_base_dist_graph_neighbors_count.lo
  CC       base/topo_base_find_available.lo
  CC       base/topo_base_frame.lo
  CC       base/topo_base_graph_create.lo
  CC       base/topo_base_graph_get.lo
  CC       base/topo_base_graph_map.lo
  CC       base/topo_base_graph_neighbors.lo
  CC       base/topo_base_graph_neighbors_count.lo
  CC       base/topo_base_graphdims_get.lo
  CC       base/topo_base_lazy_init.lo
  CCLD     libmca_topo.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/ompi/mca/topo'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/ompi/mca/topo'
make[2]: Leaving directory '/scratch/build/ompi/mca/topo'
Making install in mca/vprotocol
make[2]: Entering directory '/scratch/build/ompi/mca/vprotocol'
  CC       base/vprotocol_base.lo
  CC       base/vprotocol_base_select.lo
  CC       base/vprotocol_base_request.lo
  CC       base/vprotocol_base_parasite.lo
  CCLD     libmca_vprotocol.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/ompi/mca/vprotocol'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/ompi/mca/vprotocol'
make[2]: Leaving directory '/scratch/build/ompi/mca/vprotocol'
Making install in mca/pml/v
make[2]: Entering directory '/scratch/build/ompi/mca/pml/v'
  CC       pml_v_component.lo
  CC       pml_v_output.lo
  CCLD     libmca_pml_v.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/ompi/mca/pml/v'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Leaving directory '/scratch/build/ompi/mca/pml/v'
make[2]: Leaving directory '/scratch/build/ompi/mca/pml/v'
Making install in mca/rte/orte
make[2]: Entering directory '/scratch/build/ompi/mca/rte/orte'
cp -f ../../../../orte/tools/orterun/orterun.1 mpirun.1
cp -f ../../../../orte/tools/orterun/orterun.1 mpiexec.1
  CC       rte_orte_component.lo
  CC       rte_orte_module.lo
cp -f ../../../../orte/tools/orte-clean/orte-clean.1 ompi-clean.1
cp -f ../../../../orte/tools/orte-server/orte-server.1 ompi-server.1
  CCLD     libmca_rte_orte.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[3]: Entering directory '/scratch/build/ompi/mca/rte/orte'
make  install-exec-hook
 /usr/bin/mkdir -p '/opt/openmpi/share/man/man1'
 /usr/bin/install -c -m 644 mpirun.1 mpiexec.1 ompi-clean.1 ompi-server.1 '/opt/openmpi/share/man/man1'
make[4]: Entering directory '/scratch/build/ompi/mca/rte/orte'
(cd /opt/openmpi/bin; rm -f mpirun; ln -s orterun mpirun)
(cd /opt/openmpi/bin; rm -f mpiexec; ln -s orterun mpiexec)
(cd /opt/openmpi/bin; rm -f ompi-clean; ln -s orte-clean ompi-clean)
(cd /opt/openmpi/bin; rm -f ompi-server; ln -s orte-server ompi-server)
make[4]: Leaving directory '/scratch/build/ompi/mca/rte/orte'
make[3]: Leaving directory '/scratch/build/ompi/mca/rte/orte'
make[2]: Leaving directory '/scratch/build/ompi/mca/rte/orte'
Making install in .
make[2]: Entering directory '/scratch/build/ompi'
/usr/bin/mkdir -p `dirname ../ompi/mpi/man/man3/.dir-stamp`
  CC       runtime/libompi_mpir_la-ompi_mpi_init.lo
touch "../ompi/mpi/man/man3/.dir-stamp"
  CC       class/ompi_seq_tracker.lo
  CC       attribute/attribute.lo
  CC       attribute/attribute_predefined.lo
Get:23 https://apt.repos.intel.com/oneapi all/main amd64 intel-oneapi-compiler-shared-2023.0.0 amd64 2023.0.0-25370 [16.6 MB]
Get:24 https://apt.repos.intel.com/oneapi all/main all intel-oneapi-tbb-common-devel-2021.8.0 all 2021.8.0-25334 [182 kB]
Get:25 https://apt.repos.intel.com/oneapi all/main amd64 intel-oneapi-tbb-devel-2021.8.0 amd64 2021.8.0-25334 [890 kB]
Get:26 https://apt.repos.intel.com/oneapi all/main all intel-oneapi-dev-utilities-eclipse-cfg all 2021.10.0-49423 [1976 B]
Get:27 https://apt.repos.intel.com/oneapi all/main amd64 intel-oneapi-dev-utilities-2021.8.0 amd64 2021.8.0-25328 [11.4 MB]
  CC       communicator/comm_init.lo
  CC       communicator/comm.lo
  CC       communicator/comm_cid.lo
  CC       communicator/comm_request.lo
Get:28 https://apt.repos.intel.com/oneapi all/main amd64 intel-oneapi-dpcpp-cpp-2023.0.0 amd64 2023.0.0-25370 [488 MB]
  CC       dpm/dpm.lo
  CC       errhandler/errhandler.lo
  CC       errhandler/errhandler_invoke.lo
  CC       errhandler/errhandler_predefined.lo
  CC       errhandler/errcode.lo
  CC       errhandler/errcode-internal.lo
  CC       file/file.lo
  CC       group/group.lo
  CC       group/group_init.lo
  CC       group/group_set_rank.lo
  CC       group/group_plist.lo
  CC       group/group_sporadic.lo
  CC       group/group_strided.lo
  CC       group/group_bitmap.lo
  CC       info/info.lo
  CC       interlib/interlib.lo
  CC       message/message.lo
  CC       op/op.lo
  CC       proc/proc.lo
  CC       request/grequest.lo
  CC       request/request.lo
  CC       request/req_test.lo
  CC       request/req_wait.lo
  CC       runtime/ompi_mpi_abort.lo
  CC       runtime/ompi_mpi_dynamics.lo
  CC       runtime/ompi_mpi_finalize.lo
  CC       runtime/ompi_mpi_params.lo
  CC       runtime/ompi_mpi_preconnect.lo
  CC       runtime/ompi_cr.lo
  CC       runtime/ompi_info_support.lo
  CC       runtime/ompi_spc.lo
  CC       win/win.lo
  CC       mpiext/mpiext.lo
  CC       patterns/net/netpatterns_base.lo
  CC       patterns/net/netpatterns_multinomial_tree.lo
  CC       patterns/net/netpatterns_nary_tree.lo
  CC       patterns/net/netpatterns_knomial_tree.lo
  CC       patterns/comm/allreduce.lo
  CC       patterns/comm/allgather.lo
  CC       patterns/comm/bcast.lo
  CCLD     libompi_mpir.la
  GENERATE mpi/man/man3/MPI.3
  GENERATE mpi/man/man3/MPI_Abort.3
  GENERATE mpi/man/man3/MPI_Accumulate.3
  GENERATE mpi/man/man3/MPI_Add_error_class.3
ar: `u' modifier ignored since `D' is the default (see `U')
  GENERATE mpi/man/man3/MPI_Add_error_code.3
  GENERATE mpi/man/man3/MPI_Add_error_string.3
  GENERATE mpi/man/man3/MPI_Address.3
  GENERATE mpi/man/man3/MPI_Aint_add.3
  GENERATE mpi/man/man3/MPI_Aint_diff.3
  GENERATE mpi/man/man3/MPI_Allgather.3
  GENERATE mpi/man/man3/MPI_Iallgather.3
  GENERATE mpi/man/man3/MPI_Allgatherv.3
  GENERATE mpi/man/man3/MPI_Iallgatherv.3
  GENERATE mpi/man/man3/MPI_Alloc_mem.3
  GENERATE mpi/man/man3/MPI_Allreduce.3
  GENERATE mpi/man/man3/MPI_Iallreduce.3
  GENERATE mpi/man/man3/MPI_Alltoall.3
  GENERATE mpi/man/man3/MPI_Ialltoall.3
  GENERATE mpi/man/man3/MPI_Alltoallv.3
  GENERATE mpi/man/man3/MPI_Ialltoallv.3
  GENERATE mpi/man/man3/MPI_Alltoallw.3
  GENERATE mpi/man/man3/MPI_Ialltoallw.3
  GENERATE mpi/man/man3/MPI_Attr_delete.3
  GENERATE mpi/man/man3/MPI_Attr_get.3
  GENERATE mpi/man/man3/MPI_Attr_put.3
  GENERATE mpi/man/man3/MPI_Barrier.3
  GENERATE mpi/man/man3/MPI_Ibarrier.3
  GENERATE mpi/man/man3/MPI_Bcast.3
  GENERATE mpi/man/man3/MPI_Ibcast.3
  GENERATE mpi/man/man3/MPI_Bsend.3
  GENERATE mpi/man/man3/MPI_Bsend_init.3
  GENERATE mpi/man/man3/MPI_Buffer_attach.3
  GENERATE mpi/man/man3/MPI_Buffer_detach.3
  GENERATE mpi/man/man3/MPI_Cancel.3
  GENERATE mpi/man/man3/MPI_Cart_coords.3
  GENERATE mpi/man/man3/MPI_Cart_create.3
  GENERATE mpi/man/man3/MPI_Cartdim_get.3
  GENERATE mpi/man/man3/MPI_Cart_get.3
  GENERATE mpi/man/man3/MPI_Cart_map.3
  GENERATE mpi/man/man3/MPI_Cart_rank.3
  GENERATE mpi/man/man3/MPI_Cart_shift.3
  GENERATE mpi/man/man3/MPI_Cart_sub.3
  GENERATE mpi/man/man3/MPI_Close_port.3
  GENERATE mpi/man/man3/MPI_Comm_accept.3
  GENERATE mpi/man/man3/MPI_Comm_c2f.3
  GENERATE mpi/man/man3/MPI_Comm_call_errhandler.3
  GENERATE mpi/man/man3/MPI_Comm_compare.3
  GENERATE mpi/man/man3/MPI_Comm_connect.3
  GENERATE mpi/man/man3/MPI_Comm_create.3
  GENERATE mpi/man/man3/MPI_Comm_create_group.3
  GENERATE mpi/man/man3/MPI_Comm_create_errhandler.3
  GENERATE mpi/man/man3/MPI_Comm_create_keyval.3
  GENERATE mpi/man/man3/MPI_Comm_disconnect.3
  GENERATE mpi/man/man3/MPI_Comm_delete_attr.3
  GENERATE mpi/man/man3/MPI_Comm_dup.3
  GENERATE mpi/man/man3/MPI_Comm_dup_with_info.3
  GENERATE mpi/man/man3/MPI_Comm_idup.3
  GENERATE mpi/man/man3/MPI_Comm_f2c.3
  GENERATE mpi/man/man3/MPI_Comm_free.3
  GENERATE mpi/man/man3/MPI_Comm_free_keyval.3
  GENERATE mpi/man/man3/MPI_Comm_get_attr.3
  GENERATE mpi/man/man3/MPI_Comm_get_errhandler.3
  GENERATE mpi/man/man3/MPI_Comm_get_info.3
  GENERATE mpi/man/man3/MPI_Comm_get_name.3
  GENERATE mpi/man/man3/MPI_Comm_get_parent.3
  GENERATE mpi/man/man3/MPI_Comm_group.3
  GENERATE mpi/man/man3/MPI_Comm_join.3
  GENERATE mpi/man/man3/MPI_Comm_rank.3
  GENERATE mpi/man/man3/MPI_Comm_remote_group.3
  GENERATE mpi/man/man3/MPI_Comm_remote_size.3
  GENERATE mpi/man/man3/MPI_Comm_set_attr.3
  GENERATE mpi/man/man3/MPI_Comm_set_errhandler.3
  GENERATE mpi/man/man3/MPI_Comm_set_info.3
  GENERATE mpi/man/man3/MPI_Comm_set_name.3
  GENERATE mpi/man/man3/MPI_Comm_size.3
  GENERATE mpi/man/man3/MPI_Comm_spawn.3
  GENERATE mpi/man/man3/MPI_Comm_spawn_multiple.3
  GENERATE mpi/man/man3/MPI_Comm_split.3
  GENERATE mpi/man/man3/MPI_Comm_split_type.3
  GENERATE mpi/man/man3/MPI_Comm_test_inter.3
  GENERATE mpi/man/man3/MPI_Compare_and_swap.3
  GENERATE mpi/man/man3/MPI_Dims_create.3
  GENERATE mpi/man/man3/MPI_Dist_graph_create.3
  GENERATE mpi/man/man3/MPI_Dist_graph_create_adjacent.3
  GENERATE mpi/man/man3/MPI_Dist_graph_neighbors.3
  GENERATE mpi/man/man3/MPI_Dist_graph_neighbors_count.3
  GENERATE mpi/man/man3/MPI_Errhandler_create.3
  GENERATE mpi/man/man3/MPI_Errhandler_free.3
  GENERATE mpi/man/man3/MPI_Errhandler_get.3
  GENERATE mpi/man/man3/MPI_Errhandler_set.3
  GENERATE mpi/man/man3/MPI_Error_class.3
  GENERATE mpi/man/man3/MPI_Error_string.3
  GENERATE mpi/man/man3/MPI_Exscan.3
  GENERATE mpi/man/man3/MPI_Iexscan.3
  GENERATE mpi/man/man3/MPI_Fetch_and_op.3
  GENERATE mpi/man/man3/MPI_File_c2f.3
  GENERATE mpi/man/man3/MPI_File_call_errhandler.3
  GENERATE mpi/man/man3/MPI_File_close.3
  GENERATE mpi/man/man3/MPI_File_create_errhandler.3
  GENERATE mpi/man/man3/MPI_File_delete.3
  GENERATE mpi/man/man3/MPI_File_f2c.3
  GENERATE mpi/man/man3/MPI_File_get_amode.3
  GENERATE mpi/man/man3/MPI_File_get_atomicity.3
  GENERATE mpi/man/man3/MPI_File_get_byte_offset.3
  GENERATE mpi/man/man3/MPI_File_get_errhandler.3
  GENERATE mpi/man/man3/MPI_File_get_group.3
  GENERATE mpi/man/man3/MPI_File_get_info.3
  GENERATE mpi/man/man3/MPI_File_get_position.3
  GENERATE mpi/man/man3/MPI_File_get_size.3
  GENERATE mpi/man/man3/MPI_File_get_position_shared.3
  GENERATE mpi/man/man3/MPI_File_get_type_extent.3
  GENERATE mpi/man/man3/MPI_File_iread.3
  GENERATE mpi/man/man3/MPI_File_get_view.3
  GENERATE mpi/man/man3/MPI_File_iread_at.3
  GENERATE mpi/man/man3/MPI_File_iread_all.3
  GENERATE mpi/man/man3/MPI_File_iread_at_all.3
  GENERATE mpi/man/man3/MPI_File_iread_shared.3
  GENERATE mpi/man/man3/MPI_File_iwrite.3
  GENERATE mpi/man/man3/MPI_File_iwrite_at.3
  GENERATE mpi/man/man3/MPI_File_iwrite_all.3
  GENERATE mpi/man/man3/MPI_File_iwrite_at_all.3
  GENERATE mpi/man/man3/MPI_File_iwrite_shared.3
  GENERATE mpi/man/man3/MPI_File_open.3
  GENERATE mpi/man/man3/MPI_File_preallocate.3
  GENERATE mpi/man/man3/MPI_File_read.3
  GENERATE mpi/man/man3/MPI_File_read_all.3
  GENERATE mpi/man/man3/MPI_File_read_all_begin.3
  GENERATE mpi/man/man3/MPI_File_read_all_end.3
  GENERATE mpi/man/man3/MPI_File_read_at.3
  GENERATE mpi/man/man3/MPI_File_read_at_all.3
  GENERATE mpi/man/man3/MPI_File_read_at_all_begin.3
  GENERATE mpi/man/man3/MPI_File_read_at_all_end.3
  GENERATE mpi/man/man3/MPI_File_read_ordered.3
  GENERATE mpi/man/man3/MPI_File_read_ordered_begin.3
  GENERATE mpi/man/man3/MPI_File_read_ordered_end.3
  GENERATE mpi/man/man3/MPI_File_seek.3
  GENERATE mpi/man/man3/MPI_File_read_shared.3
  GENERATE mpi/man/man3/MPI_File_seek_shared.3
  GENERATE mpi/man/man3/MPI_File_set_atomicity.3
  GENERATE mpi/man/man3/MPI_File_set_errhandler.3
  GENERATE mpi/man/man3/MPI_File_set_info.3
  GENERATE mpi/man/man3/MPI_File_set_size.3
  GENERATE mpi/man/man3/MPI_File_set_view.3
  GENERATE mpi/man/man3/MPI_File_sync.3
  GENERATE mpi/man/man3/MPI_File_write.3
  GENERATE mpi/man/man3/MPI_File_write_all.3
  GENERATE mpi/man/man3/MPI_File_write_all_begin.3
  GENERATE mpi/man/man3/MPI_File_write_all_end.3
  GENERATE mpi/man/man3/MPI_File_write_at.3
  GENERATE mpi/man/man3/MPI_File_write_at_all.3
  GENERATE mpi/man/man3/MPI_File_write_at_all_begin.3
  GENERATE mpi/man/man3/MPI_File_write_at_all_end.3
  GENERATE mpi/man/man3/MPI_File_write_ordered.3
  GENERATE mpi/man/man3/MPI_File_write_ordered_begin.3
  GENERATE mpi/man/man3/MPI_File_write_ordered_end.3
  GENERATE mpi/man/man3/MPI_File_write_shared.3
  GENERATE mpi/man/man3/MPI_Finalize.3
  GENERATE mpi/man/man3/MPI_Finalized.3
  GENERATE mpi/man/man3/MPI_Free_mem.3
  GENERATE mpi/man/man3/MPI_Gather.3
  GENERATE mpi/man/man3/MPI_Igather.3
  GENERATE mpi/man/man3/MPI_Gatherv.3
  GENERATE mpi/man/man3/MPI_Igatherv.3
  GENERATE mpi/man/man3/MPI_Get.3
  GENERATE mpi/man/man3/MPI_Get_accumulate.3
  GENERATE mpi/man/man3/MPI_Get_address.3
  GENERATE mpi/man/man3/MPI_Get_count.3
  GENERATE mpi/man/man3/MPI_Get_elements.3
  GENERATE mpi/man/man3/MPI_Get_elements_x.3
  GENERATE mpi/man/man3/MPI_Get_library_version.3
  GENERATE mpi/man/man3/MPI_Get_processor_name.3
  GENERATE mpi/man/man3/MPI_Get_version.3
  GENERATE mpi/man/man3/MPI_Graph_create.3
  GENERATE mpi/man/man3/MPI_Graphdims_get.3
  GENERATE mpi/man/man3/MPI_Graph_get.3
  GENERATE mpi/man/man3/MPI_Graph_map.3
  GENERATE mpi/man/man3/MPI_Graph_neighbors.3
  GENERATE mpi/man/man3/MPI_Graph_neighbors_count.3
  GENERATE mpi/man/man3/MPI_Grequest_complete.3
  GENERATE mpi/man/man3/MPI_Grequest_start.3
  GENERATE mpi/man/man3/MPI_Group_c2f.3
  GENERATE mpi/man/man3/MPI_Group_compare.3
  GENERATE mpi/man/man3/MPI_Group_difference.3
  GENERATE mpi/man/man3/MPI_Group_excl.3
  GENERATE mpi/man/man3/MPI_Group_f2c.3
  GENERATE mpi/man/man3/MPI_Group_free.3
  GENERATE mpi/man/man3/MPI_Group_incl.3
  GENERATE mpi/man/man3/MPI_Group_intersection.3
  GENERATE mpi/man/man3/MPI_Group_range_excl.3
  GENERATE mpi/man/man3/MPI_Group_range_incl.3
  GENERATE mpi/man/man3/MPI_Group_rank.3
  GENERATE mpi/man/man3/MPI_Group_size.3
  GENERATE mpi/man/man3/MPI_Group_translate_ranks.3
  GENERATE mpi/man/man3/MPI_Group_union.3
  GENERATE mpi/man/man3/MPI_Ibsend.3
  GENERATE mpi/man/man3/MPI_Improbe.3
  GENERATE mpi/man/man3/MPI_Imrecv.3
  GENERATE mpi/man/man3/MPI_Info_c2f.3
  GENERATE mpi/man/man3/MPI_Info_create.3
  GENERATE mpi/man/man3/MPI_Info_delete.3
  GENERATE mpi/man/man3/MPI_Info_dup.3
  GENERATE mpi/man/man3/MPI_Info_env.3
  GENERATE mpi/man/man3/MPI_Info_f2c.3
  GENERATE mpi/man/man3/MPI_Info_free.3
  GENERATE mpi/man/man3/MPI_Info_get.3
  GENERATE mpi/man/man3/MPI_Info_get_nkeys.3
  GENERATE mpi/man/man3/MPI_Info_get_nthkey.3
  GENERATE mpi/man/man3/MPI_Info_get_valuelen.3
  GENERATE mpi/man/man3/MPI_Info_set.3
  GENERATE mpi/man/man3/MPI_Init.3
  GENERATE mpi/man/man3/MPI_Initialized.3
  GENERATE mpi/man/man3/MPI_Init_thread.3
  GENERATE mpi/man/man3/MPI_Intercomm_create.3
  GENERATE mpi/man/man3/MPI_Intercomm_merge.3
  GENERATE mpi/man/man3/MPI_Iprobe.3
  GENERATE mpi/man/man3/MPI_Irecv.3
  GENERATE mpi/man/man3/MPI_Irsend.3
  GENERATE mpi/man/man3/MPI_Isend.3
  GENERATE mpi/man/man3/MPI_Issend.3
  GENERATE mpi/man/man3/MPI_Is_thread_main.3
  GENERATE mpi/man/man3/MPI_Keyval_create.3
  GENERATE mpi/man/man3/MPI_Keyval_free.3
  GENERATE mpi/man/man3/MPI_Lookup_name.3
  GENERATE mpi/man/man3/MPI_Message_c2f.3
  GENERATE mpi/man/man3/MPI_Message_f2c.3
  GENERATE mpi/man/man3/MPI_Mprobe.3
  GENERATE mpi/man/man3/MPI_Mrecv.3
  GENERATE mpi/man/man3/MPI_Neighbor_allgather.3
  GENERATE mpi/man/man3/MPI_Ineighbor_allgather.3
  GENERATE mpi/man/man3/MPI_Neighbor_allgatherv.3
  GENERATE mpi/man/man3/MPI_Ineighbor_allgatherv.3
  GENERATE mpi/man/man3/MPI_Neighbor_alltoall.3
  GENERATE mpi/man/man3/MPI_Ineighbor_alltoall.3
  GENERATE mpi/man/man3/MPI_Neighbor_alltoallv.3
  GENERATE mpi/man/man3/MPI_Ineighbor_alltoallv.3
  GENERATE mpi/man/man3/MPI_Neighbor_alltoallw.3
  GENERATE mpi/man/man3/MPI_Ineighbor_alltoallw.3
  GENERATE mpi/man/man3/MPI_Op_c2f.3
  GENERATE mpi/man/man3/MPI_Op_commutative.3
  GENERATE mpi/man/man3/MPI_Op_create.3
  GENERATE mpi/man/man3/MPI_Open_port.3
  GENERATE mpi/man/man3/MPI_Op_f2c.3
  GENERATE mpi/man/man3/MPI_Op_free.3
  GENERATE mpi/man/man3/MPI_Pack.3
  GENERATE mpi/man/man3/MPI_Pack_external.3
  GENERATE mpi/man/man3/MPI_Pack_external_size.3
  GENERATE mpi/man/man3/MPI_Pack_size.3
  GENERATE mpi/man/man3/MPI_Pcontrol.3
  GENERATE mpi/man/man3/MPI_Probe.3
  GENERATE mpi/man/man3/MPI_Publish_name.3
  GENERATE mpi/man/man3/MPI_Put.3
  GENERATE mpi/man/man3/MPI_Query_thread.3
  GENERATE mpi/man/man3/MPI_Raccumulate.3
  GENERATE mpi/man/man3/MPI_Recv.3
  GENERATE mpi/man/man3/MPI_Recv_init.3
  GENERATE mpi/man/man3/MPI_Reduce.3
  GENERATE mpi/man/man3/MPI_Ireduce.3
  GENERATE mpi/man/man3/MPI_Reduce_local.3
  GENERATE mpi/man/man3/MPI_Reduce_scatter.3
  GENERATE mpi/man/man3/MPI_Ireduce_scatter.3
  GENERATE mpi/man/man3/MPI_Reduce_scatter_block.3
  GENERATE mpi/man/man3/MPI_Ireduce_scatter_block.3
  GENERATE mpi/man/man3/MPI_Register_datarep.3
  GENERATE mpi/man/man3/MPI_Request_c2f.3
  GENERATE mpi/man/man3/MPI_Request_f2c.3
  GENERATE mpi/man/man3/MPI_Request_free.3
  GENERATE mpi/man/man3/MPI_Request_get_status.3
  GENERATE mpi/man/man3/MPI_Rget.3
  GENERATE mpi/man/man3/MPI_Rget_accumulate.3
  GENERATE mpi/man/man3/MPI_Rput.3
  GENERATE mpi/man/man3/MPI_Rsend.3
  GENERATE mpi/man/man3/MPI_Rsend_init.3
  GENERATE mpi/man/man3/MPI_Scan.3
  GENERATE mpi/man/man3/MPI_Iscan.3
  GENERATE mpi/man/man3/MPI_Scatter.3
  GENERATE mpi/man/man3/MPI_Iscatter.3
  GENERATE mpi/man/man3/MPI_Scatterv.3
  GENERATE mpi/man/man3/MPI_Iscatterv.3
  GENERATE mpi/man/man3/MPI_Send.3
  GENERATE mpi/man/man3/MPI_Send_init.3
  GENERATE mpi/man/man3/MPI_Sendrecv.3
  GENERATE mpi/man/man3/MPI_Sendrecv_replace.3
  GENERATE mpi/man/man3/MPI_Sizeof.3
  GENERATE mpi/man/man3/MPI_Ssend.3
  GENERATE mpi/man/man3/MPI_Ssend_init.3
  GENERATE mpi/man/man3/MPI_Start.3
  GENERATE mpi/man/man3/MPI_Startall.3
  GENERATE mpi/man/man3/MPI_Status_c2f.3
  GENERATE mpi/man/man3/MPI_Status_f2c.3
  GENERATE mpi/man/man3/MPI_Status_set_cancelled.3
  GENERATE mpi/man/man3/MPI_Status_set_elements.3
  GENERATE mpi/man/man3/MPI_Status_set_elements_x.3
  GENERATE mpi/man/man3/MPI_T_category_changed.3
  GENERATE mpi/man/man3/MPI_T_category_get_categories.3
  GENERATE mpi/man/man3/MPI_T_category_get_info.3
  GENERATE mpi/man/man3/MPI_T_category_get_cvars.3
  GENERATE mpi/man/man3/MPI_T_category_get_num.3
  GENERATE mpi/man/man3/MPI_T_category_get_pvars.3
  GENERATE mpi/man/man3/MPI_T_cvar_get_info.3
  GENERATE mpi/man/man3/MPI_T_cvar_get_num.3
  GENERATE mpi/man/man3/MPI_T_cvar_handle_alloc.3
  GENERATE mpi/man/man3/MPI_T_cvar_handle_free.3
  GENERATE mpi/man/man3/MPI_T_cvar_read.3
  GENERATE mpi/man/man3/MPI_T_cvar_write.3
  GENERATE mpi/man/man3/MPI_T_enum_get_info.3
  GENERATE mpi/man/man3/MPI_T_enum_get_item.3
  GENERATE mpi/man/man3/MPI_T_finalize.3
  GENERATE mpi/man/man3/MPI_T_init_thread.3
  GENERATE mpi/man/man3/MPI_T_pvar_get_info.3
  GENERATE mpi/man/man3/MPI_T_pvar_get_num.3
  GENERATE mpi/man/man3/MPI_T_pvar_handle_alloc.3
  GENERATE mpi/man/man3/MPI_T_pvar_handle_free.3
  GENERATE mpi/man/man3/MPI_T_pvar_read.3
  GENERATE mpi/man/man3/MPI_T_pvar_readreset.3
  GENERATE mpi/man/man3/MPI_T_pvar_reset.3
  GENERATE mpi/man/man3/MPI_T_pvar_session_create.3
  GENERATE mpi/man/man3/MPI_T_pvar_session_free.3
  GENERATE mpi/man/man3/MPI_T_pvar_start.3
  GENERATE mpi/man/man3/MPI_T_pvar_stop.3
  GENERATE mpi/man/man3/MPI_T_pvar_write.3
  GENERATE mpi/man/man3/MPI_Test.3
  GENERATE mpi/man/man3/MPI_Testall.3
  GENERATE mpi/man/man3/MPI_Testany.3
  GENERATE mpi/man/man3/MPI_Test_cancelled.3
  GENERATE mpi/man/man3/MPI_Testsome.3
  GENERATE mpi/man/man3/MPI_Topo_test.3
  GENERATE mpi/man/man3/MPI_Type_c2f.3
  GENERATE mpi/man/man3/MPI_Type_commit.3
  GENERATE mpi/man/man3/MPI_Type_contiguous.3
  GENERATE mpi/man/man3/MPI_Type_create_darray.3
  GENERATE mpi/man/man3/MPI_Type_create_f90_complex.3
  GENERATE mpi/man/man3/MPI_Type_create_f90_integer.3
  GENERATE mpi/man/man3/MPI_Type_create_f90_real.3
  GENERATE mpi/man/man3/MPI_Type_create_hindexed.3
  GENERATE mpi/man/man3/MPI_Type_create_hindexed_block.3
  GENERATE mpi/man/man3/MPI_Type_create_hvector.3
  GENERATE mpi/man/man3/MPI_Type_create_indexed_block.3
  GENERATE mpi/man/man3/MPI_Type_create_keyval.3
  GENERATE mpi/man/man3/MPI_Type_create_resized.3
  GENERATE mpi/man/man3/MPI_Type_create_struct.3
  GENERATE mpi/man/man3/MPI_Type_create_subarray.3
  GENERATE mpi/man/man3/MPI_Type_delete_attr.3
  GENERATE mpi/man/man3/MPI_Type_extent.3
  GENERATE mpi/man/man3/MPI_Type_dup.3
  GENERATE mpi/man/man3/MPI_Type_f2c.3
  GENERATE mpi/man/man3/MPI_Type_free.3
  GENERATE mpi/man/man3/MPI_Type_free_keyval.3
  GENERATE mpi/man/man3/MPI_Type_get_attr.3
  GENERATE mpi/man/man3/MPI_Type_get_contents.3
  GENERATE mpi/man/man3/MPI_Type_get_envelope.3
  GENERATE mpi/man/man3/MPI_Type_get_extent.3
  GENERATE mpi/man/man3/MPI_Type_get_extent_x.3
  GENERATE mpi/man/man3/MPI_Type_get_true_extent.3
  GENERATE mpi/man/man3/MPI_Type_get_true_extent_x.3
  GENERATE mpi/man/man3/MPI_Type_get_name.3
  GENERATE mpi/man/man3/MPI_Type_hindexed.3
  GENERATE mpi/man/man3/MPI_Type_hvector.3
  GENERATE mpi/man/man3/MPI_Type_indexed.3
  GENERATE mpi/man/man3/MPI_Type_lb.3
  GENERATE mpi/man/man3/MPI_Type_match_size.3
  GENERATE mpi/man/man3/MPI_Type_set_attr.3
  GENERATE mpi/man/man3/MPI_Type_set_name.3
  GENERATE mpi/man/man3/MPI_Type_size.3
  GENERATE mpi/man/man3/MPI_Type_size_x.3
  GENERATE mpi/man/man3/MPI_Type_vector.3
  GENERATE mpi/man/man3/MPI_Type_struct.3
  GENERATE mpi/man/man3/MPI_Type_ub.3
  GENERATE mpi/man/man3/MPI_Unpack.3
  GENERATE mpi/man/man3/MPI_Unpack_external.3
  GENERATE mpi/man/man3/MPI_Unpublish_name.3
  GENERATE mpi/man/man3/MPI_Wait.3
  GENERATE mpi/man/man3/MPI_Waitall.3
  GENERATE mpi/man/man3/MPI_Waitany.3
  GENERATE mpi/man/man3/MPI_Waitsome.3
  GENERATE mpi/man/man3/MPI_Win_allocate.3
  GENERATE mpi/man/man3/MPI_Win_allocate_shared.3
  GENERATE mpi/man/man3/MPI_Win_attach.3
  GENERATE mpi/man/man3/MPI_Win_c2f.3
  GENERATE mpi/man/man3/MPI_Win_call_errhandler.3
  GENERATE mpi/man/man3/MPI_Win_complete.3
  GENERATE mpi/man/man3/MPI_Win_create.3
  GENERATE mpi/man/man3/MPI_Win_create_dynamic.3
  GENERATE mpi/man/man3/MPI_Win_create_errhandler.3
  GENERATE mpi/man/man3/MPI_Win_create_keyval.3
  GENERATE mpi/man/man3/MPI_Win_delete_attr.3
  GENERATE mpi/man/man3/MPI_Win_detach.3
  GENERATE mpi/man/man3/MPI_Win_f2c.3
  GENERATE mpi/man/man3/MPI_Win_fence.3
  GENERATE mpi/man/man3/MPI_Win_flush.3
  GENERATE mpi/man/man3/MPI_Win_flush_all.3
  GENERATE mpi/man/man3/MPI_Win_flush_local.3
  GENERATE mpi/man/man3/MPI_Win_flush_local_all.3
  GENERATE mpi/man/man3/MPI_Win_free.3
  GENERATE mpi/man/man3/MPI_Win_free_keyval.3
  GENERATE mpi/man/man3/MPI_Win_get_attr.3
  GENERATE mpi/man/man3/MPI_Win_get_errhandler.3
  GENERATE mpi/man/man3/MPI_Win_get_group.3
  GENERATE mpi/man/man3/MPI_Win_get_info.3
  GENERATE mpi/man/man3/MPI_Win_get_name.3
  GENERATE mpi/man/man3/MPI_Win_lock.3
  GENERATE mpi/man/man3/MPI_Win_lock_all.3
  GENERATE mpi/man/man3/MPI_Win_post.3
  GENERATE mpi/man/man3/MPI_Win_set_attr.3
  GENERATE mpi/man/man3/MPI_Win_set_errhandler.3
  GENERATE mpi/man/man3/MPI_Win_set_info.3
  GENERATE mpi/man/man3/MPI_Win_set_name.3
  GENERATE mpi/man/man3/MPI_Win_shared_query.3
  GENERATE mpi/man/man3/MPI_Win_start.3
  GENERATE mpi/man/man3/MPI_Win_sync.3
  GENERATE mpi/man/man3/MPI_Win_test.3
  GENERATE mpi/man/man3/MPI_Win_unlock.3
  GENERATE mpi/man/man3/MPI_Win_unlock_all.3
  GENERATE mpi/man/man3/MPI_Win_wait.3
  GENERATE mpi/man/man3/MPI_Wtick.3
  GENERATE mpi/man/man3/MPI_Wtime.3
  GENERATE mpi/man/man3/OpenMPI.3
  CCLD     libmpi.la
make[3]: Entering directory '/scratch/build/ompi'
 /usr/bin/mkdir -p '/opt/openmpi/lib'
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /bin/bash ../libtool   --mode=install /usr/bin/install -c   libmpi.la '/opt/openmpi/lib'
 /usr/bin/install -c -m 644 ../../openmpi/ompi/errhandler/help-mpi-errors.txt ../../openmpi/ompi/runtime/help-mpi-runtime.txt ../../openmpi/ompi/mpi/help-mpi-api.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/share/man/man3'
libtool: warning: relinking 'libmpi.la'
libtool: install: (cd /scratch/build/ompi; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -version-info 60:2:20 -o libmpi.la -rpath /opt/openmpi/lib class/ompi_seq_tracker.lo attribute/attribute.lo attribute/attribute_predefined.lo communicator/comm_init.lo communicator/comm.lo communicator/comm_cid.lo communicator/comm_request.lo dpm/dpm.lo errhandler/errhandler.lo errhandler/errhandler_invoke.lo errhandler/errhandler_predefined.lo errhandler/errcode.lo errhandler/errcode-internal.lo file/file.lo group/group.lo group/group_init.lo group/group_set_rank.lo group/group_plist.lo group/group_sporadic.lo group/group_strided.lo group/group_bitmap.lo info/info.lo interlib/interlib.lo message/message.lo op/op.lo proc/proc.lo request/grequest.lo request/request.lo request/req_test.lo request/req_wait.lo runtime/ompi_mpi_abort.lo runtime/ompi_mpi_dynamics.lo runtime/ompi_mpi_finalize.lo runtime/ompi_mpi_params.lo runtime/ompi_mpi_preconnect.lo runtime/ompi_cr.lo runtime/ompi_info_support.lo runtime/ompi_spc.lo win/win.lo mpiext/mpiext.lo patterns/net/netpatterns_base.lo patterns/net/netpatterns_multinomial_tree.lo patterns/net/netpatterns_nary_tree.lo patterns/net/netpatterns_knomial_tree.lo patterns/comm/allreduce.lo patterns/comm/allgather.lo patterns/comm/bcast.lo datatype/libdatatype.la debuggers/libdebuggers.la mpi/c/libmpi_c.la mpi/tool/libmpi_mpit_common.la mpi/c/profile/libmpi_c_pmpi.la mpi/tool/profile/libmpi_pmpit.la mca/bml/libmca_bml.la mca/coll/libmca_coll.la mca/crcp/libmca_crcp.la mca/fbtl/libmca_fbtl.la mca/fcoll/libmca_fcoll.la mca/fs/libmca_fs.la mca/hook/libmca_hook.la mca/io/libmca_io.la mca/mtl/libmca_mtl.la mca/op/libmca_op.la mca/osc/libmca_osc.la mca/pml/libmca_pml.la mca/pml/v/libmca_pml_v.la mca/rte/libmca_rte.la mca/rte/orte/libmca_rte_orte.la mca/sharedfp/libmca_sharedfp.la mca/topo/libmca_topo.la mca/vprotocol/libmca_vprotocol.la ../ompi/mpiext/affinity/c/libmpiext_affinity_c.la ../ompi/mpiext/cuda/c/libmpiext_cuda_c.la ../ompi/mpiext/pcollreq/c/libmpiext_pcollreq_c.la /scratch/build/orte/libopen-rte.la /scratch/build/opal/libopen-pal.la libompi_mpir.la -lrt -lm -lutil )
 /usr/bin/install -c -m 644 mpi/man/man3/MPI.3 mpi/man/man3/MPI_Abort.3 mpi/man/man3/MPI_Accumulate.3 mpi/man/man3/MPI_Add_error_class.3 mpi/man/man3/MPI_Add_error_code.3 mpi/man/man3/MPI_Add_error_string.3 mpi/man/man3/MPI_Address.3 mpi/man/man3/MPI_Aint_add.3 mpi/man/man3/MPI_Aint_diff.3 mpi/man/man3/MPI_Allgather.3 mpi/man/man3/MPI_Iallgather.3 mpi/man/man3/MPI_Allgatherv.3 mpi/man/man3/MPI_Iallgatherv.3 mpi/man/man3/MPI_Alloc_mem.3 mpi/man/man3/MPI_Allreduce.3 mpi/man/man3/MPI_Iallreduce.3 mpi/man/man3/MPI_Alltoall.3 mpi/man/man3/MPI_Ialltoall.3 mpi/man/man3/MPI_Alltoallv.3 mpi/man/man3/MPI_Ialltoallv.3 mpi/man/man3/MPI_Alltoallw.3 mpi/man/man3/MPI_Ialltoallw.3 mpi/man/man3/MPI_Attr_delete.3 mpi/man/man3/MPI_Attr_get.3 mpi/man/man3/MPI_Attr_put.3 mpi/man/man3/MPI_Barrier.3 mpi/man/man3/MPI_Ibarrier.3 mpi/man/man3/MPI_Bcast.3 mpi/man/man3/MPI_Ibcast.3 mpi/man/man3/MPI_Bsend.3 mpi/man/man3/MPI_Bsend_init.3 mpi/man/man3/MPI_Buffer_attach.3 mpi/man/man3/MPI_Buffer_detach.3 mpi/man/man3/MPI_Cancel.3 mpi/man/man3/MPI_Cart_coords.3 mpi/man/man3/MPI_Cart_create.3 mpi/man/man3/MPI_Cartdim_get.3 mpi/man/man3/MPI_Cart_get.3 mpi/man/man3/MPI_Cart_map.3 mpi/man/man3/MPI_Cart_rank.3 '/opt/openmpi/share/man/man3'
 /usr/bin/install -c -m 644 mpi/man/man3/MPI_Cart_shift.3 mpi/man/man3/MPI_Cart_sub.3 mpi/man/man3/MPI_Close_port.3 mpi/man/man3/MPI_Comm_accept.3 mpi/man/man3/MPI_Comm_c2f.3 mpi/man/man3/MPI_Comm_call_errhandler.3 mpi/man/man3/MPI_Comm_compare.3 mpi/man/man3/MPI_Comm_connect.3 mpi/man/man3/MPI_Comm_create.3 mpi/man/man3/MPI_Comm_create_group.3 mpi/man/man3/MPI_Comm_create_errhandler.3 mpi/man/man3/MPI_Comm_create_keyval.3 mpi/man/man3/MPI_Comm_delete_attr.3 mpi/man/man3/MPI_Comm_disconnect.3 mpi/man/man3/MPI_Comm_dup.3 mpi/man/man3/MPI_Comm_dup_with_info.3 mpi/man/man3/MPI_Comm_idup.3 mpi/man/man3/MPI_Comm_f2c.3 mpi/man/man3/MPI_Comm_free.3 mpi/man/man3/MPI_Comm_free_keyval.3 mpi/man/man3/MPI_Comm_get_attr.3 mpi/man/man3/MPI_Comm_get_errhandler.3 mpi/man/man3/MPI_Comm_get_info.3 mpi/man/man3/MPI_Comm_get_name.3 mpi/man/man3/MPI_Comm_get_parent.3 mpi/man/man3/MPI_Comm_group.3 mpi/man/man3/MPI_Comm_join.3 mpi/man/man3/MPI_Comm_rank.3 mpi/man/man3/MPI_Comm_remote_group.3 mpi/man/man3/MPI_Comm_remote_size.3 mpi/man/man3/MPI_Comm_set_attr.3 mpi/man/man3/MPI_Comm_set_errhandler.3 mpi/man/man3/MPI_Comm_set_info.3 mpi/man/man3/MPI_Comm_set_name.3 mpi/man/man3/MPI_Comm_size.3 mpi/man/man3/MPI_Comm_spawn.3 mpi/man/man3/MPI_Comm_spawn_multiple.3 mpi/man/man3/MPI_Comm_split.3 mpi/man/man3/MPI_Comm_split_type.3 mpi/man/man3/MPI_Comm_test_inter.3 '/opt/openmpi/share/man/man3'
 /usr/bin/install -c -m 644 mpi/man/man3/MPI_Compare_and_swap.3 mpi/man/man3/MPI_Dims_create.3 mpi/man/man3/MPI_Dist_graph_create.3 mpi/man/man3/MPI_Dist_graph_create_adjacent.3 mpi/man/man3/MPI_Dist_graph_neighbors.3 mpi/man/man3/MPI_Dist_graph_neighbors_count.3 mpi/man/man3/MPI_Errhandler_create.3 mpi/man/man3/MPI_Errhandler_free.3 mpi/man/man3/MPI_Errhandler_get.3 mpi/man/man3/MPI_Errhandler_set.3 mpi/man/man3/MPI_Error_class.3 mpi/man/man3/MPI_Error_string.3 mpi/man/man3/MPI_Exscan.3 mpi/man/man3/MPI_Iexscan.3 mpi/man/man3/MPI_Fetch_and_op.3 mpi/man/man3/MPI_File_c2f.3 mpi/man/man3/MPI_File_call_errhandler.3 mpi/man/man3/MPI_File_close.3 mpi/man/man3/MPI_File_create_errhandler.3 mpi/man/man3/MPI_File_delete.3 mpi/man/man3/MPI_File_f2c.3 mpi/man/man3/MPI_File_get_amode.3 mpi/man/man3/MPI_File_get_atomicity.3 mpi/man/man3/MPI_File_get_byte_offset.3 mpi/man/man3/MPI_File_get_errhandler.3 mpi/man/man3/MPI_File_get_group.3 mpi/man/man3/MPI_File_get_info.3 mpi/man/man3/MPI_File_get_position.3 mpi/man/man3/MPI_File_get_position_shared.3 mpi/man/man3/MPI_File_get_size.3 mpi/man/man3/MPI_File_get_type_extent.3 mpi/man/man3/MPI_File_get_view.3 mpi/man/man3/MPI_File_iread.3 mpi/man/man3/MPI_File_iread_at.3 mpi/man/man3/MPI_File_iread_all.3 mpi/man/man3/MPI_File_iread_at_all.3 mpi/man/man3/MPI_File_iread_shared.3 mpi/man/man3/MPI_File_iwrite.3 mpi/man/man3/MPI_File_iwrite_at.3 mpi/man/man3/MPI_File_iwrite_all.3 '/opt/openmpi/share/man/man3'
 /usr/bin/install -c -m 644 mpi/man/man3/MPI_File_iwrite_at_all.3 mpi/man/man3/MPI_File_iwrite_shared.3 mpi/man/man3/MPI_File_open.3 mpi/man/man3/MPI_File_preallocate.3 mpi/man/man3/MPI_File_read.3 mpi/man/man3/MPI_File_read_all.3 mpi/man/man3/MPI_File_read_all_begin.3 mpi/man/man3/MPI_File_read_all_end.3 mpi/man/man3/MPI_File_read_at.3 mpi/man/man3/MPI_File_read_at_all.3 mpi/man/man3/MPI_File_read_at_all_begin.3 mpi/man/man3/MPI_File_read_at_all_end.3 mpi/man/man3/MPI_File_read_ordered.3 mpi/man/man3/MPI_File_read_ordered_begin.3 mpi/man/man3/MPI_File_read_ordered_end.3 mpi/man/man3/MPI_File_read_shared.3 mpi/man/man3/MPI_File_seek.3 mpi/man/man3/MPI_File_seek_shared.3 mpi/man/man3/MPI_File_set_atomicity.3 mpi/man/man3/MPI_File_set_errhandler.3 mpi/man/man3/MPI_File_set_info.3 mpi/man/man3/MPI_File_set_size.3 mpi/man/man3/MPI_File_set_view.3 mpi/man/man3/MPI_File_sync.3 mpi/man/man3/MPI_File_write.3 mpi/man/man3/MPI_File_write_all.3 mpi/man/man3/MPI_File_write_all_begin.3 mpi/man/man3/MPI_File_write_all_end.3 mpi/man/man3/MPI_File_write_at.3 mpi/man/man3/MPI_File_write_at_all.3 mpi/man/man3/MPI_File_write_at_all_begin.3 mpi/man/man3/MPI_File_write_at_all_end.3 mpi/man/man3/MPI_File_write_ordered.3 mpi/man/man3/MPI_File_write_ordered_begin.3 mpi/man/man3/MPI_File_write_ordered_end.3 mpi/man/man3/MPI_File_write_shared.3 mpi/man/man3/MPI_Finalize.3 mpi/man/man3/MPI_Finalized.3 mpi/man/man3/MPI_Free_mem.3 mpi/man/man3/MPI_Gather.3 '/opt/openmpi/share/man/man3'
 /usr/bin/install -c -m 644 mpi/man/man3/MPI_Igather.3 mpi/man/man3/MPI_Gatherv.3 mpi/man/man3/MPI_Igatherv.3 mpi/man/man3/MPI_Get.3 mpi/man/man3/MPI_Get_accumulate.3 mpi/man/man3/MPI_Get_address.3 mpi/man/man3/MPI_Get_count.3 mpi/man/man3/MPI_Get_elements.3 mpi/man/man3/MPI_Get_elements_x.3 mpi/man/man3/MPI_Get_library_version.3 mpi/man/man3/MPI_Get_processor_name.3 mpi/man/man3/MPI_Get_version.3 mpi/man/man3/MPI_Graph_create.3 mpi/man/man3/MPI_Graphdims_get.3 mpi/man/man3/MPI_Graph_get.3 mpi/man/man3/MPI_Graph_map.3 mpi/man/man3/MPI_Graph_neighbors.3 mpi/man/man3/MPI_Graph_neighbors_count.3 mpi/man/man3/MPI_Grequest_complete.3 mpi/man/man3/MPI_Grequest_start.3 mpi/man/man3/MPI_Group_c2f.3 mpi/man/man3/MPI_Group_compare.3 mpi/man/man3/MPI_Group_difference.3 mpi/man/man3/MPI_Group_excl.3 mpi/man/man3/MPI_Group_f2c.3 mpi/man/man3/MPI_Group_free.3 mpi/man/man3/MPI_Group_incl.3 mpi/man/man3/MPI_Group_intersection.3 mpi/man/man3/MPI_Group_range_excl.3 mpi/man/man3/MPI_Group_range_incl.3 mpi/man/man3/MPI_Group_rank.3 mpi/man/man3/MPI_Group_size.3 mpi/man/man3/MPI_Group_translate_ranks.3 mpi/man/man3/MPI_Group_union.3 mpi/man/man3/MPI_Ibsend.3 mpi/man/man3/MPI_Improbe.3 mpi/man/man3/MPI_Imrecv.3 mpi/man/man3/MPI_Info_c2f.3 mpi/man/man3/MPI_Info_create.3 mpi/man/man3/MPI_Info_delete.3 '/opt/openmpi/share/man/man3'
 /usr/bin/install -c -m 644 mpi/man/man3/MPI_Info_dup.3 mpi/man/man3/MPI_Info_env.3 mpi/man/man3/MPI_Info_f2c.3 mpi/man/man3/MPI_Info_free.3 mpi/man/man3/MPI_Info_get.3 mpi/man/man3/MPI_Info_get_nkeys.3 mpi/man/man3/MPI_Info_get_nthkey.3 mpi/man/man3/MPI_Info_get_valuelen.3 mpi/man/man3/MPI_Info_set.3 mpi/man/man3/MPI_Init.3 mpi/man/man3/MPI_Initialized.3 mpi/man/man3/MPI_Init_thread.3 mpi/man/man3/MPI_Intercomm_create.3 mpi/man/man3/MPI_Intercomm_merge.3 mpi/man/man3/MPI_Iprobe.3 mpi/man/man3/MPI_Irecv.3 mpi/man/man3/MPI_Irsend.3 mpi/man/man3/MPI_Isend.3 mpi/man/man3/MPI_Issend.3 mpi/man/man3/MPI_Is_thread_main.3 mpi/man/man3/MPI_Keyval_create.3 mpi/man/man3/MPI_Keyval_free.3 mpi/man/man3/MPI_Lookup_name.3 mpi/man/man3/MPI_Message_c2f.3 mpi/man/man3/MPI_Message_f2c.3 mpi/man/man3/MPI_Mprobe.3 mpi/man/man3/MPI_Mrecv.3 mpi/man/man3/MPI_Neighbor_allgather.3 mpi/man/man3/MPI_Ineighbor_allgather.3 mpi/man/man3/MPI_Neighbor_allgatherv.3 mpi/man/man3/MPI_Ineighbor_allgatherv.3 mpi/man/man3/MPI_Neighbor_alltoall.3 mpi/man/man3/MPI_Ineighbor_alltoall.3 mpi/man/man3/MPI_Neighbor_alltoallv.3 mpi/man/man3/MPI_Ineighbor_alltoallv.3 mpi/man/man3/MPI_Neighbor_alltoallw.3 mpi/man/man3/MPI_Ineighbor_alltoallw.3 mpi/man/man3/MPI_Op_c2f.3 mpi/man/man3/MPI_Op_commutative.3 mpi/man/man3/MPI_Op_create.3 '/opt/openmpi/share/man/man3'
 /usr/bin/install -c -m 644 mpi/man/man3/MPI_Open_port.3 mpi/man/man3/MPI_Op_f2c.3 mpi/man/man3/MPI_Op_free.3 mpi/man/man3/MPI_Pack.3 mpi/man/man3/MPI_Pack_external.3 mpi/man/man3/MPI_Pack_external_size.3 mpi/man/man3/MPI_Pack_size.3 mpi/man/man3/MPI_Pcontrol.3 mpi/man/man3/MPI_Probe.3 mpi/man/man3/MPI_Publish_name.3 mpi/man/man3/MPI_Put.3 mpi/man/man3/MPI_Query_thread.3 mpi/man/man3/MPI_Raccumulate.3 mpi/man/man3/MPI_Recv.3 mpi/man/man3/MPI_Recv_init.3 mpi/man/man3/MPI_Reduce.3 mpi/man/man3/MPI_Ireduce.3 mpi/man/man3/MPI_Reduce_local.3 mpi/man/man3/MPI_Reduce_scatter.3 mpi/man/man3/MPI_Ireduce_scatter.3 mpi/man/man3/MPI_Reduce_scatter_block.3 mpi/man/man3/MPI_Ireduce_scatter_block.3 mpi/man/man3/MPI_Register_datarep.3 mpi/man/man3/MPI_Request_c2f.3 mpi/man/man3/MPI_Request_f2c.3 mpi/man/man3/MPI_Request_free.3 mpi/man/man3/MPI_Request_get_status.3 mpi/man/man3/MPI_Rget.3 mpi/man/man3/MPI_Rget_accumulate.3 mpi/man/man3/MPI_Rput.3 mpi/man/man3/MPI_Rsend.3 mpi/man/man3/MPI_Rsend_init.3 mpi/man/man3/MPI_Scan.3 mpi/man/man3/MPI_Iscan.3 mpi/man/man3/MPI_Scatter.3 mpi/man/man3/MPI_Iscatter.3 mpi/man/man3/MPI_Scatterv.3 mpi/man/man3/MPI_Iscatterv.3 mpi/man/man3/MPI_Send.3 mpi/man/man3/MPI_Send_init.3 '/opt/openmpi/share/man/man3'
 /usr/bin/install -c -m 644 mpi/man/man3/MPI_Sendrecv.3 mpi/man/man3/MPI_Sendrecv_replace.3 mpi/man/man3/MPI_Sizeof.3 mpi/man/man3/MPI_Ssend.3 mpi/man/man3/MPI_Ssend_init.3 mpi/man/man3/MPI_Start.3 mpi/man/man3/MPI_Startall.3 mpi/man/man3/MPI_Status_c2f.3 mpi/man/man3/MPI_Status_f2c.3 mpi/man/man3/MPI_Status_set_cancelled.3 mpi/man/man3/MPI_Status_set_elements.3 mpi/man/man3/MPI_Status_set_elements_x.3 mpi/man/man3/MPI_T_category_changed.3 mpi/man/man3/MPI_T_category_get_categories.3 mpi/man/man3/MPI_T_category_get_cvars.3 mpi/man/man3/MPI_T_category_get_info.3 mpi/man/man3/MPI_T_category_get_num.3 mpi/man/man3/MPI_T_category_get_pvars.3 mpi/man/man3/MPI_T_cvar_get_info.3 mpi/man/man3/MPI_T_cvar_get_num.3 mpi/man/man3/MPI_T_cvar_handle_alloc.3 mpi/man/man3/MPI_T_cvar_handle_free.3 mpi/man/man3/MPI_T_cvar_read.3 mpi/man/man3/MPI_T_cvar_write.3 mpi/man/man3/MPI_T_enum_get_info.3 mpi/man/man3/MPI_T_enum_get_item.3 mpi/man/man3/MPI_T_finalize.3 mpi/man/man3/MPI_T_init_thread.3 mpi/man/man3/MPI_T_pvar_get_info.3 mpi/man/man3/MPI_T_pvar_get_num.3 mpi/man/man3/MPI_T_pvar_handle_alloc.3 mpi/man/man3/MPI_T_pvar_handle_free.3 mpi/man/man3/MPI_T_pvar_read.3 mpi/man/man3/MPI_T_pvar_readreset.3 mpi/man/man3/MPI_T_pvar_reset.3 mpi/man/man3/MPI_T_pvar_session_create.3 mpi/man/man3/MPI_T_pvar_session_free.3 mpi/man/man3/MPI_T_pvar_start.3 mpi/man/man3/MPI_T_pvar_stop.3 mpi/man/man3/MPI_T_pvar_write.3 '/opt/openmpi/share/man/man3'
 /usr/bin/install -c -m 644 mpi/man/man3/MPI_Test.3 mpi/man/man3/MPI_Testall.3 mpi/man/man3/MPI_Testany.3 mpi/man/man3/MPI_Test_cancelled.3 mpi/man/man3/MPI_Testsome.3 mpi/man/man3/MPI_Topo_test.3 mpi/man/man3/MPI_Type_c2f.3 mpi/man/man3/MPI_Type_commit.3 mpi/man/man3/MPI_Type_contiguous.3 mpi/man/man3/MPI_Type_create_darray.3 mpi/man/man3/MPI_Type_create_f90_complex.3 mpi/man/man3/MPI_Type_create_f90_integer.3 mpi/man/man3/MPI_Type_create_f90_real.3 mpi/man/man3/MPI_Type_create_hindexed.3 mpi/man/man3/MPI_Type_create_hindexed_block.3 mpi/man/man3/MPI_Type_create_hvector.3 mpi/man/man3/MPI_Type_create_indexed_block.3 mpi/man/man3/MPI_Type_create_keyval.3 mpi/man/man3/MPI_Type_create_resized.3 mpi/man/man3/MPI_Type_create_struct.3 mpi/man/man3/MPI_Type_create_subarray.3 mpi/man/man3/MPI_Type_delete_attr.3 mpi/man/man3/MPI_Type_dup.3 mpi/man/man3/MPI_Type_extent.3 mpi/man/man3/MPI_Type_f2c.3 mpi/man/man3/MPI_Type_free.3 mpi/man/man3/MPI_Type_free_keyval.3 mpi/man/man3/MPI_Type_get_attr.3 mpi/man/man3/MPI_Type_get_contents.3 mpi/man/man3/MPI_Type_get_envelope.3 mpi/man/man3/MPI_Type_get_extent.3 mpi/man/man3/MPI_Type_get_extent_x.3 mpi/man/man3/MPI_Type_get_name.3 mpi/man/man3/MPI_Type_get_true_extent.3 mpi/man/man3/MPI_Type_get_true_extent_x.3 mpi/man/man3/MPI_Type_hindexed.3 mpi/man/man3/MPI_Type_hvector.3 mpi/man/man3/MPI_Type_indexed.3 mpi/man/man3/MPI_Type_lb.3 mpi/man/man3/MPI_Type_match_size.3 '/opt/openmpi/share/man/man3'
 /usr/bin/install -c -m 644 mpi/man/man3/MPI_Type_set_attr.3 mpi/man/man3/MPI_Type_set_name.3 mpi/man/man3/MPI_Type_size.3 mpi/man/man3/MPI_Type_size_x.3 mpi/man/man3/MPI_Type_struct.3 mpi/man/man3/MPI_Type_ub.3 mpi/man/man3/MPI_Type_vector.3 mpi/man/man3/MPI_Unpack.3 mpi/man/man3/MPI_Unpack_external.3 mpi/man/man3/MPI_Unpublish_name.3 mpi/man/man3/MPI_Wait.3 mpi/man/man3/MPI_Waitall.3 mpi/man/man3/MPI_Waitany.3 mpi/man/man3/MPI_Waitsome.3 mpi/man/man3/MPI_Win_allocate.3 mpi/man/man3/MPI_Win_allocate_shared.3 mpi/man/man3/MPI_Win_attach.3 mpi/man/man3/MPI_Win_c2f.3 mpi/man/man3/MPI_Win_call_errhandler.3 mpi/man/man3/MPI_Win_complete.3 mpi/man/man3/MPI_Win_create.3 mpi/man/man3/MPI_Win_create_dynamic.3 mpi/man/man3/MPI_Win_create_errhandler.3 mpi/man/man3/MPI_Win_create_keyval.3 mpi/man/man3/MPI_Win_delete_attr.3 mpi/man/man3/MPI_Win_detach.3 mpi/man/man3/MPI_Win_f2c.3 mpi/man/man3/MPI_Win_fence.3 mpi/man/man3/MPI_Win_flush.3 mpi/man/man3/MPI_Win_flush_all.3 mpi/man/man3/MPI_Win_flush_local.3 mpi/man/man3/MPI_Win_flush_local_all.3 mpi/man/man3/MPI_Win_free.3 mpi/man/man3/MPI_Win_free_keyval.3 mpi/man/man3/MPI_Win_get_attr.3 mpi/man/man3/MPI_Win_get_errhandler.3 mpi/man/man3/MPI_Win_get_group.3 mpi/man/man3/MPI_Win_get_info.3 mpi/man/man3/MPI_Win_get_name.3 mpi/man/man3/MPI_Win_lock.3 '/opt/openmpi/share/man/man3'
 /usr/bin/install -c -m 644 mpi/man/man3/MPI_Win_lock_all.3 mpi/man/man3/MPI_Win_post.3 mpi/man/man3/MPI_Win_set_attr.3 mpi/man/man3/MPI_Win_set_errhandler.3 mpi/man/man3/MPI_Win_set_info.3 mpi/man/man3/MPI_Win_set_name.3 mpi/man/man3/MPI_Win_shared_query.3 mpi/man/man3/MPI_Win_start.3 mpi/man/man3/MPI_Win_sync.3 mpi/man/man3/MPI_Win_test.3 mpi/man/man3/MPI_Win_unlock.3 mpi/man/man3/MPI_Win_unlock_all.3 mpi/man/man3/MPI_Win_wait.3 mpi/man/man3/MPI_Wtick.3 mpi/man/man3/MPI_Wtime.3 mpi/man/man3/OpenMPI.3 '/opt/openmpi/share/man/man3'
libtool: install: /usr/bin/install -c .libs/libmpi.so.40.20.2T /opt/openmpi/lib/libmpi.so.40.20.2
libtool: install: (cd /opt/openmpi/lib && { ln -s -f libmpi.so.40.20.2 libmpi.so.40 || { rm -f libmpi.so.40 && ln -s libmpi.so.40.20.2 libmpi.so.40; }; })
libtool: install: (cd /opt/openmpi/lib && { ln -s -f libmpi.so.40.20.2 libmpi.so || { rm -f libmpi.so && ln -s libmpi.so.40.20.2 libmpi.so; }; })
libtool: install: /usr/bin/install -c .libs/libmpi.lai /opt/openmpi/lib/libmpi.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
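(Editor's note, not part of the build output: the libtool notice above explains how a consumer would link against the libraries just installed under /opt/openmpi/lib. A minimal sketch, assuming the /opt/openmpi prefix shown in this log, a GCC toolchain, and a hypothetical hello.c MPI source file:)

 # link against the installed libmpi and bake the library directory into the runtime search path
 gcc -o hello hello.c -I/opt/openmpi/include -L/opt/openmpi/lib -lmpi -Wl,-rpath,/opt/openmpi/lib
 # alternatively, leave out -rpath and point the dynamic loader at the directory at run time
 export LD_LIBRARY_PATH=/opt/openmpi/lib:$LD_LIBRARY_PATH
 ./hello

(Either approach satisfies the notice; the -Wl,-rpath form avoids having to set LD_LIBRARY_PATH for every execution.)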
make[3]: Leaving directory '/scratch/build/ompi'
make[2]: Leaving directory '/scratch/build/ompi'
Making install in mpi/cxx
make[2]: Entering directory '/scratch/build/ompi/mpi/cxx'
make[3]: Entering directory '/scratch/build/ompi/mpi/cxx'
make[3]: Leaving directory '/scratch/build/ompi/mpi/cxx'
make[2]: Leaving directory '/scratch/build/ompi/mpi/cxx'
Making install in mpiext/pcollreq/mpif-h
make[2]: Entering directory '/scratch/build/ompi/mpiext/pcollreq/mpif-h'
Making install in profile
make[3]: Entering directory '/scratch/build/ompi/mpiext/pcollreq/mpif-h/profile'
make[4]: Entering directory '/scratch/build/ompi/mpiext/pcollreq/mpif-h/profile'
make[4]: Nothing to be done for 'install-exec-am'.
make[4]: Leaving directory '/scratch/build/ompi/mpiext/pcollreq/mpif-h/profile'
make[3]: Leaving directory '/scratch/build/ompi/mpiext/pcollreq/mpif-h/profile'
make[3]: Entering directory '/scratch/build/ompi/mpiext/pcollreq/mpif-h'
make[4]: Entering directory '/scratch/build/ompi/mpiext/pcollreq/mpif-h'
make[4]: Nothing to be done for 'install-exec-am'.
make[4]: Leaving directory '/scratch/build/ompi/mpiext/pcollreq/mpif-h'
make[3]: Leaving directory '/scratch/build/ompi/mpiext/pcollreq/mpif-h'
make[2]: Leaving directory '/scratch/build/ompi/mpiext/pcollreq/mpif-h'
Making install in mpi/fortran/mpif-h
make[2]: Entering directory '/scratch/build/ompi/mpi/fortran/mpif-h'
Making install in profile
make[3]: Entering directory '/scratch/build/ompi/mpi/fortran/mpif-h/profile'
make[4]: Entering directory '/scratch/build/ompi/mpi/fortran/mpif-h/profile'
make[4]: Nothing to be done for 'install-exec-am'.
make[4]: Nothing to be done for 'install-data-am'.
make[4]: Leaving directory '/scratch/build/ompi/mpi/fortran/mpif-h/profile'
make[3]: Leaving directory '/scratch/build/ompi/mpi/fortran/mpif-h/profile'
make[3]: Entering directory '/scratch/build/ompi/mpi/fortran/mpif-h'
make[4]: Entering directory '/scratch/build/ompi/mpi/fortran/mpif-h'
make[4]: Leaving directory '/scratch/build/ompi/mpi/fortran/mpif-h'
make[3]: Leaving directory '/scratch/build/ompi/mpi/fortran/mpif-h'
make[2]: Leaving directory '/scratch/build/ompi/mpi/fortran/mpif-h'
Making install in mpi/fortran/mpiext-use-mpi
make[2]: Entering directory '/scratch/build/ompi/mpi/fortran/mpiext-use-mpi'
make[3]: Entering directory '/scratch/build/ompi/mpi/fortran/mpiext-use-mpi'
make  install-exec-hook
make[3]: Nothing to be done for 'install-data-am'.
make[4]: Entering directory '/scratch/build/ompi/mpi/fortran/mpiext-use-mpi'
make[4]: Nothing to be done for 'install-exec-hook'.
make[4]: Leaving directory '/scratch/build/ompi/mpi/fortran/mpiext-use-mpi'
make[3]: Leaving directory '/scratch/build/ompi/mpi/fortran/mpiext-use-mpi'
make[2]: Leaving directory '/scratch/build/ompi/mpi/fortran/mpiext-use-mpi'
Making install in mpi/fortran/use-mpi-f08/mod
make[2]: Entering directory '/scratch/build/ompi/mpi/fortran/use-mpi-f08/mod'
make[3]: Entering directory '/scratch/build/ompi/mpi/fortran/use-mpi-f08/mod'
make  install-exec-hook
make[3]: Nothing to be done for 'install-data-am'.
make[4]: Entering directory '/scratch/build/ompi/mpi/fortran/use-mpi-f08/mod'
make[4]: Nothing to be done for 'install-exec-hook'.
make[4]: Leaving directory '/scratch/build/ompi/mpi/fortran/use-mpi-f08/mod'
make[3]: Leaving directory '/scratch/build/ompi/mpi/fortran/use-mpi-f08/mod'
make[2]: Leaving directory '/scratch/build/ompi/mpi/fortran/use-mpi-f08/mod'
Making install in mpi/fortran/use-mpi-f08/bindings
make[2]: Entering directory '/scratch/build/ompi/mpi/fortran/use-mpi-f08/bindings'
make[3]: Entering directory '/scratch/build/ompi/mpi/fortran/use-mpi-f08/bindings'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/scratch/build/ompi/mpi/fortran/use-mpi-f08/bindings'
make[2]: Leaving directory '/scratch/build/ompi/mpi/fortran/use-mpi-f08/bindings'
Making install in mpiext/pcollreq/use-mpi-f08
make[2]: Entering directory '/scratch/build/ompi/mpiext/pcollreq/use-mpi-f08'
make[3]: Entering directory '/scratch/build/ompi/mpiext/pcollreq/use-mpi-f08'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/scratch/build/ompi/mpiext/pcollreq/use-mpi-f08'
make[2]: Leaving directory '/scratch/build/ompi/mpiext/pcollreq/use-mpi-f08'
Making install in mpi/fortran/use-mpi-f08
make[2]: Entering directory '/scratch/build/ompi/mpi/fortran/use-mpi-f08'
make[3]: Entering directory '/scratch/build/ompi/mpi/fortran/use-mpi-f08'
make[3]: Nothing to be done for 'install-data-am'.
make  install-exec-hook
make[4]: Entering directory '/scratch/build/ompi/mpi/fortran/use-mpi-f08'
make[4]: Nothing to be done for 'install-exec-hook'.
make[4]: Leaving directory '/scratch/build/ompi/mpi/fortran/use-mpi-f08'
make[3]: Leaving directory '/scratch/build/ompi/mpi/fortran/use-mpi-f08'
make[2]: Leaving directory '/scratch/build/ompi/mpi/fortran/use-mpi-f08'
Making install in mpi/fortran/mpiext-use-mpi-f08
make[2]: Entering directory '/scratch/build/ompi/mpi/fortran/mpiext-use-mpi-f08'
make[3]: Entering directory '/scratch/build/ompi/mpi/fortran/mpiext-use-mpi-f08'
make  install-exec-hook
make[3]: Nothing to be done for 'install-data-am'.
make[4]: Entering directory '/scratch/build/ompi/mpi/fortran/mpiext-use-mpi-f08'
make[4]: Nothing to be done for 'install-exec-hook'.
make[4]: Leaving directory '/scratch/build/ompi/mpi/fortran/mpiext-use-mpi-f08'
make[3]: Leaving directory '/scratch/build/ompi/mpi/fortran/mpiext-use-mpi-f08'
make[2]: Leaving directory '/scratch/build/ompi/mpi/fortran/mpiext-use-mpi-f08'
Making install in mca/common/monitoring
make[2]: Entering directory '/scratch/build/ompi/mca/common/monitoring'
  CC       libmca_common_monitoring_la-common_monitoring.lo
  CC       libmca_common_monitoring_la-common_monitoring_coll.lo
  CC       monitoring_prof.lo
  LN_S     libmca_common_monitoring.la
  CCLD     ompi_monitoring_prof.la
  CCLD     libmca_common_monitoring.la
make[3]: Entering directory '/scratch/build/ompi/mca/common/monitoring'
 /usr/bin/mkdir -p '/opt/openmpi/bin'
 /usr/bin/install -c ../../../../../openmpi/ompi/mca/common/monitoring/profile2mat.pl ../../../../../openmpi/ompi/mca/common/monitoring/aggregate_profile.pl '/opt/openmpi/bin'
make[3]: Nothing to be done for 'install-data-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   libmca_common_monitoring.la ompi_monitoring_prof.la '/opt/openmpi/lib'
libtool: install: /usr/bin/install -c .libs/libmca_common_monitoring.so.50.10.0 /opt/openmpi/lib/libmca_common_monitoring.so.50.10.0
libtool: install: (cd /opt/openmpi/lib && { ln -s -f libmca_common_monitoring.so.50.10.0 libmca_common_monitoring.so.50 || { rm -f libmca_common_monitoring.so.50 && ln -s libmca_common_monitoring.so.50.10.0 libmca_common_monitoring.so.50; }; })
libtool: install: (cd /opt/openmpi/lib && { ln -s -f libmca_common_monitoring.so.50.10.0 libmca_common_monitoring.so || { rm -f libmca_common_monitoring.so && ln -s libmca_common_monitoring.so.50.10.0 libmca_common_monitoring.so; }; })
libtool: install: /usr/bin/install -c .libs/libmca_common_monitoring.lai /opt/openmpi/lib/libmca_common_monitoring.la
libtool: warning: relinking 'ompi_monitoring_prof.la'
libtool: install: (cd /scratch/build/ompi/mca/common/monitoring; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -shared -o ompi_monitoring_prof.la -rpath /opt/openmpi/lib monitoring_prof.lo ../../../../ompi/libmpi.la ../../../../opal/libopen-pal.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/ompi_monitoring_prof.soT /opt/openmpi/lib/ompi_monitoring_prof.so
libtool: install: /usr/bin/install -c .libs/ompi_monitoring_prof.lai /opt/openmpi/lib/ompi_monitoring_prof.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
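The libtool notice above is generic linker guidance rather than anything specific to this Cabana job. As a minimal sketch of what it describes, assuming the /opt/openmpi prefix shown in this log and a hypothetical source file hello_mpi.c (neither the file nor the link line appears anywhere in this build), a program could embed the library directory via -Wl,-rpath instead of relying on LD_LIBRARY_PATH:

/* hello_mpi.c -- hypothetical example, not part of this build's sources */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank = 0;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    printf("hello from rank %d\n", rank);

    MPI_Finalize();
    return 0;
}

/*
 * Link against the libraries installed above, embedding the rpath so the
 * binary runs without LD_LIBRARY_PATH (paths assume the /opt/openmpi
 * prefix used in this log):
 *
 *   gcc hello_mpi.c -I/opt/openmpi/include -L/opt/openmpi/lib -lmpi \
 *       -Wl,-rpath -Wl,/opt/openmpi/lib -o hello_mpi
 *
 * Alternatively, export LD_LIBRARY_PATH=/opt/openmpi/lib at run time, as
 * the notice suggests.
 */

In practice the mpicc wrapper installed with this Open MPI build would normally supply the include, library, and rpath flags itself; the explicit gcc line above only mirrors the options the notice lists.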
make[3]: Leaving directory '/scratch/build/ompi/mca/common/monitoring'
make[2]: Leaving directory '/scratch/build/ompi/mca/common/monitoring'
Making install in mca/common/ompio
make[2]: Entering directory '/scratch/build/ompi/mca/common/ompio'
  CC       libmca_common_ompio_la-common_ompio_aggregators.lo
  CC       libmca_common_ompio_la-common_ompio_print_queue.lo
  CC       libmca_common_ompio_la-common_ompio_request.lo
  CC       libmca_common_ompio_la-common_ompio_file_open.lo
  CC       libmca_common_ompio_la-common_ompio_file_view.lo
  CC       libmca_common_ompio_la-common_ompio_file_read.lo
  CC       libmca_common_ompio_la-common_ompio_file_write.lo
  CC       libmca_common_ompio_la-common_ompio_cuda.lo
  LN_S     libmca_common_ompio.la
  CCLD     libmca_common_ompio.la
make[3]: Entering directory '/scratch/build/ompi/mca/common/ompio'
 /usr/bin/mkdir -p '/opt/openmpi/lib'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   libmca_common_ompio.la '/opt/openmpi/lib'
libtool: install: /usr/bin/install -c .libs/libmca_common_ompio.so.41.19.2 /opt/openmpi/lib/libmca_common_ompio.so.41.19.2
libtool: install: (cd /opt/openmpi/lib && { ln -s -f libmca_common_ompio.so.41.19.2 libmca_common_ompio.so.41 || { rm -f libmca_common_ompio.so.41 && ln -s libmca_common_ompio.so.41.19.2 libmca_common_ompio.so.41; }; })
libtool: install: (cd /opt/openmpi/lib && { ln -s -f libmca_common_ompio.so.41.19.2 libmca_common_ompio.so || { rm -f libmca_common_ompio.so && ln -s libmca_common_ompio.so.41.19.2 libmca_common_ompio.so; }; })
libtool: install: /usr/bin/install -c .libs/libmca_common_ompio.lai /opt/openmpi/lib/libmca_common_ompio.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib
make[3]: Leaving directory '/scratch/build/ompi/mca/common/ompio'
make[2]: Leaving directory '/scratch/build/ompi/mca/common/ompio'
Making install in mca/bml/r2
make[2]: Entering directory '/scratch/build/ompi/mca/bml/r2'
  CC       bml_r2.lo
  CC       bml_r2_component.lo
  CC       bml_r2_ft.lo
  CCLD     mca_bml_r2.la
make[3]: Entering directory '/scratch/build/ompi/mca/bml/r2'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/ompi/mca/bml/r2/help-mca-bml-r2.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_bml_r2.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_bml_r2.la'
libtool: install: (cd /scratch/build/ompi/mca/bml/r2; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_bml_r2.la -rpath /opt/openmpi/lib/openmpi bml_r2.lo bml_r2_component.lo bml_r2_ft.lo ../../../../ompi/libmpi.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_bml_r2.soT /opt/openmpi/lib/openmpi/mca_bml_r2.so
libtool: install: /usr/bin/install -c .libs/mca_bml_r2.lai /opt/openmpi/lib/openmpi/mca_bml_r2.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib/openmpi

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
make[3]: Leaving directory '/scratch/build/ompi/mca/bml/r2'
make[2]: Leaving directory '/scratch/build/ompi/mca/bml/r2'
Making install in mca/coll/basic
make[2]: Entering directory '/scratch/build/ompi/mca/coll/basic'
  CC       coll_basic_allgather.lo
  CC       coll_basic_allgatherv.lo
  CC       coll_basic_allreduce.lo
  CC       coll_basic_alltoall.lo
  CC       coll_basic_alltoallv.lo
  CC       coll_basic_alltoallw.lo
  CC       coll_basic_barrier.lo
  CC       coll_basic_bcast.lo
  CC       coll_basic_component.lo
  CC       coll_basic_gather.lo
  CC       coll_basic_gatherv.lo
  CC       coll_basic_module.lo
  CC       coll_basic_neighbor_allgather.lo
  CC       coll_basic_neighbor_allgatherv.lo
  CC       coll_basic_neighbor_alltoall.lo
  CC       coll_basic_neighbor_alltoallv.lo
  CC       coll_basic_neighbor_alltoallw.lo
  CC       coll_basic_reduce.lo
  CC       coll_basic_reduce_scatter.lo
  CC       coll_basic_reduce_scatter_block.lo
  CC       coll_basic_scan.lo
  CC       coll_basic_exscan.lo
  CC       coll_basic_scatter.lo
  CC       coll_basic_scatterv.lo
  CCLD     mca_coll_basic.la
make[3]: Entering directory '/scratch/build/ompi/mca/coll/basic'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_coll_basic.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_coll_basic.la'
libtool: install: (cd /scratch/build/ompi/mca/coll/basic; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_coll_basic.la -rpath /opt/openmpi/lib/openmpi coll_basic_allgather.lo coll_basic_allgatherv.lo coll_basic_allreduce.lo coll_basic_alltoall.lo coll_basic_alltoallv.lo coll_basic_alltoallw.lo coll_basic_barrier.lo coll_basic_bcast.lo coll_basic_component.lo coll_basic_gather.lo coll_basic_gatherv.lo coll_basic_module.lo coll_basic_neighbor_allgather.lo coll_basic_neighbor_allgatherv.lo coll_basic_neighbor_alltoall.lo coll_basic_neighbor_alltoallv.lo coll_basic_neighbor_alltoallw.lo coll_basic_reduce.lo coll_basic_reduce_scatter.lo coll_basic_reduce_scatter_block.lo coll_basic_scan.lo coll_basic_exscan.lo coll_basic_scatter.lo coll_basic_scatterv.lo ../../../../ompi/libmpi.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_coll_basic.soT /opt/openmpi/lib/openmpi/mca_coll_basic.so
libtool: install: /usr/bin/install -c .libs/mca_coll_basic.lai /opt/openmpi/lib/openmpi/mca_coll_basic.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/coll/basic'
make[2]: Leaving directory '/scratch/build/ompi/mca/coll/basic'
Making install in mca/coll/inter
make[2]: Entering directory '/scratch/build/ompi/mca/coll/inter'
  CC       coll_inter.lo
  CC       coll_inter_allreduce.lo
  CC       coll_inter_allgather.lo
  CC       coll_inter_allgatherv.lo
  CC       coll_inter_gather.lo
  CC       coll_inter_gatherv.lo
  CC       coll_inter_scatter.lo
  CC       coll_inter_scatterv.lo
  CC       coll_inter_bcast.lo
  CC       coll_inter_component.lo
  CC       coll_inter_reduce.lo
  CCLD     mca_coll_inter.la
make[3]: Entering directory '/scratch/build/ompi/mca/coll/inter'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_coll_inter.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_coll_inter.la'
libtool: install: (cd /scratch/build/ompi/mca/coll/inter; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_coll_inter.la -rpath /opt/openmpi/lib/openmpi coll_inter.lo coll_inter_allreduce.lo coll_inter_allgather.lo coll_inter_allgatherv.lo coll_inter_gather.lo coll_inter_gatherv.lo coll_inter_scatter.lo coll_inter_scatterv.lo coll_inter_bcast.lo coll_inter_component.lo coll_inter_reduce.lo ../../../../ompi/libmpi.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_coll_inter.soT /opt/openmpi/lib/openmpi/mca_coll_inter.so
libtool: install: /usr/bin/install -c .libs/mca_coll_inter.lai /opt/openmpi/lib/openmpi/mca_coll_inter.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/coll/inter'
make[2]: Leaving directory '/scratch/build/ompi/mca/coll/inter'
Making install in mca/coll/libnbc
make[2]: Entering directory '/scratch/build/ompi/mca/coll/libnbc'
  CC       coll_libnbc_component.lo
  CC       nbc.lo
  CC       nbc_iallgather.lo
  CC       nbc_iallgatherv.lo
  CC       nbc_iallreduce.lo
  CC       nbc_ialltoall.lo
  CC       nbc_ialltoallv.lo
  CC       nbc_ialltoallw.lo
  CC       nbc_ibarrier.lo
  CC       nbc_ibcast.lo
  CC       nbc_iexscan.lo
  CC       nbc_igather.lo
  CC       nbc_igatherv.lo
  CC       nbc_ineighbor_allgather.lo
  CC       nbc_ineighbor_allgatherv.lo
  CC       nbc_ineighbor_alltoall.lo
  CC       nbc_ineighbor_alltoallv.lo
  CC       nbc_ineighbor_alltoallw.lo
  CC       nbc_ireduce.lo
  CC       nbc_ireduce_scatter.lo
  CC       nbc_ireduce_scatter_block.lo
  CC       nbc_iscan.lo
  CC       nbc_iscatter.lo
  CC       nbc_iscatterv.lo
  CC       nbc_neighbor_helpers.lo
  CC       libdict/dict.lo
  CC       libdict/hb_tree.lo
  CCLD     mca_coll_libnbc.la
make[3]: Entering directory '/scratch/build/ompi/mca/coll/libnbc'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_coll_libnbc.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_coll_libnbc.la'
libtool: install: (cd /scratch/build/ompi/mca/coll/libnbc; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_coll_libnbc.la -rpath /opt/openmpi/lib/openmpi coll_libnbc_component.lo nbc.lo libdict/dict.lo libdict/hb_tree.lo nbc_iallgather.lo nbc_iallgatherv.lo nbc_iallreduce.lo nbc_ialltoall.lo nbc_ialltoallv.lo nbc_ialltoallw.lo nbc_ibarrier.lo nbc_ibcast.lo nbc_iexscan.lo nbc_igather.lo nbc_igatherv.lo nbc_ineighbor_allgather.lo nbc_ineighbor_allgatherv.lo nbc_ineighbor_alltoall.lo nbc_ineighbor_alltoallv.lo nbc_ineighbor_alltoallw.lo nbc_ireduce.lo nbc_ireduce_scatter.lo nbc_ireduce_scatter_block.lo nbc_iscan.lo nbc_iscatter.lo nbc_iscatterv.lo nbc_neighbor_helpers.lo ../../../../ompi/libmpi.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_coll_libnbc.soT /opt/openmpi/lib/openmpi/mca_coll_libnbc.so
libtool: install: /usr/bin/install -c .libs/mca_coll_libnbc.lai /opt/openmpi/lib/openmpi/mca_coll_libnbc.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/coll/libnbc'
make[2]: Leaving directory '/scratch/build/ompi/mca/coll/libnbc'
Making install in mca/coll/self
make[2]: Entering directory '/scratch/build/ompi/mca/coll/self'
  CC       coll_self_allgather.lo
  CC       coll_self_allgatherv.lo
  CC       coll_self_alltoall.lo
  CC       coll_self_allreduce.lo
  CC       coll_self_alltoallv.lo
  CC       coll_self_alltoallw.lo
  CC       coll_self_barrier.lo
  CC       coll_self_bcast.lo
  CC       coll_self_component.lo
  CC       coll_self_gather.lo
  CC       coll_self_gatherv.lo
  CC       coll_self_module.lo
  CC       coll_self_reduce.lo
  CC       coll_self_reduce_scatter.lo
  CC       coll_self_scan.lo
  CC       coll_self_exscan.lo
  CC       coll_self_scatter.lo
  CC       coll_self_scatterv.lo
  CCLD     mca_coll_self.la
make[3]: Entering directory '/scratch/build/ompi/mca/coll/self'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_coll_self.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_coll_self.la'
libtool: install: (cd /scratch/build/ompi/mca/coll/self; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_coll_self.la -rpath /opt/openmpi/lib/openmpi coll_self_allgather.lo coll_self_allgatherv.lo coll_self_allreduce.lo coll_self_alltoall.lo coll_self_alltoallv.lo coll_self_alltoallw.lo coll_self_barrier.lo coll_self_bcast.lo coll_self_component.lo coll_self_gather.lo coll_self_gatherv.lo coll_self_module.lo coll_self_reduce.lo coll_self_reduce_scatter.lo coll_self_scan.lo coll_self_exscan.lo coll_self_scatter.lo coll_self_scatterv.lo ../../../../ompi/libmpi.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_coll_self.soT /opt/openmpi/lib/openmpi/mca_coll_self.so
libtool: install: /usr/bin/install -c .libs/mca_coll_self.lai /opt/openmpi/lib/openmpi/mca_coll_self.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/coll/self'
make[2]: Leaving directory '/scratch/build/ompi/mca/coll/self'
Making install in mca/coll/sm
make[2]: Entering directory '/scratch/build/ompi/mca/coll/sm'
  CC       coll_sm_allreduce.lo
  CC       coll_sm_barrier.lo
  CC       coll_sm_component.lo
  CC       coll_sm_bcast.lo
  CC       coll_sm_module.lo
  CC       coll_sm_reduce.lo
  CCLD     mca_coll_sm.la
make[3]: Entering directory '/scratch/build/ompi/mca/coll/sm'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/ompi/mca/coll/sm/help-mpi-coll-sm.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_coll_sm.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_coll_sm.la'
libtool: install: (cd /scratch/build/ompi/mca/coll/sm; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_coll_sm.la -rpath /opt/openmpi/lib/openmpi coll_sm_allreduce.lo coll_sm_barrier.lo coll_sm_bcast.lo coll_sm_component.lo coll_sm_module.lo coll_sm_reduce.lo ../../../../ompi/libmpi.la /scratch/build/opal/mca/common/sm/libmca_common_sm.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_coll_sm.soT /opt/openmpi/lib/openmpi/mca_coll_sm.so
libtool: install: /usr/bin/install -c .libs/mca_coll_sm.lai /opt/openmpi/lib/openmpi/mca_coll_sm.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/coll/sm'
make[2]: Leaving directory '/scratch/build/ompi/mca/coll/sm'
Making install in mca/coll/sync
make[2]: Entering directory '/scratch/build/ompi/mca/coll/sync'
  CC       coll_sync_component.lo
  CC       coll_sync_module.lo
  CC       coll_sync_bcast.lo
  CC       coll_sync_exscan.lo
  CC       coll_sync_gather.lo
  CC       coll_sync_gatherv.lo
  CC       coll_sync_reduce.lo
  CC       coll_sync_reduce_scatter.lo
  CC       coll_sync_scatter.lo
  CC       coll_sync_scan.lo
  CC       coll_sync_scatterv.lo
  CCLD     mca_coll_sync.la
make[3]: Entering directory '/scratch/build/ompi/mca/coll/sync'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/ompi/mca/coll/sync/help-coll-sync.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_coll_sync.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_coll_sync.la'
libtool: install: (cd /scratch/build/ompi/mca/coll/sync; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_coll_sync.la -rpath /opt/openmpi/lib/openmpi coll_sync_component.lo coll_sync_module.lo coll_sync_bcast.lo coll_sync_exscan.lo coll_sync_gather.lo coll_sync_gatherv.lo coll_sync_reduce.lo coll_sync_reduce_scatter.lo coll_sync_scan.lo coll_sync_scatter.lo coll_sync_scatterv.lo ../../../../ompi/libmpi.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_coll_sync.soT /opt/openmpi/lib/openmpi/mca_coll_sync.so
libtool: install: /usr/bin/install -c .libs/mca_coll_sync.lai /opt/openmpi/lib/openmpi/mca_coll_sync.la
Get:29 https://apt.repos.intel.com/oneapi all/main amd64 intel-oneapi-libdpstd-devel-2022.0.0 amd64 2022.0.0-25335 [214 kB]
Get:30 https://apt.repos.intel.com/oneapi all/main amd64 intel-oneapi-compiler-dpcpp-cpp-2023.0.0 amd64 2023.0.0-25370 [1776 B]
debconf: delaying package configuration, since apt-utils is not installed
Fetched 1215 MB in 1min 35s (12.8 MB/s)
Selecting previously unselected package intel-oneapi-common-licensing-2023.0.0.
(Reading database ... 22947 files and directories currently installed.)
Preparing to unpack .../00-intel-oneapi-common-licensing-2023.0.0_2023.0.0-25325_all.deb ...
Unpacking intel-oneapi-common-licensing-2023.0.0 (2023.0.0-25325) ...
Selecting previously unselected package intel-oneapi-common-licensing-2023.1.0.
Preparing to unpack .../01-intel-oneapi-common-licensing-2023.1.0_2023.1.0-43473_all.deb ...
Unpacking intel-oneapi-common-licensing-2023.1.0 (2023.1.0-43473) ...
Selecting previously unselected package intel-oneapi-common-licensing-2023.2.0.
Preparing to unpack .../02-intel-oneapi-common-licensing-2023.2.0_2023.2.0-49462_all.deb ...
Unpacking intel-oneapi-common-licensing-2023.2.0 (2023.2.0-49462) ...
Selecting previously unselected package intel-oneapi-common-licensing-2024.0.
Preparing to unpack .../03-intel-oneapi-common-licensing-2024.0_2024.0.0-49406_all.deb ...
Unpacking intel-oneapi-common-licensing-2024.0 (2024.0.0-49406) ...
Selecting previously unselected package intel-oneapi-common-oneapi-vars-2024.0.
Preparing to unpack .../04-intel-oneapi-common-oneapi-vars-2024.0_2024.0.0-49406_all.deb ...
Unpacking intel-oneapi-common-oneapi-vars-2024.0 (2024.0.0-49406) ...
Selecting previously unselected package intel-oneapi-common-vars.
Preparing to unpack .../05-intel-oneapi-common-vars_2024.0.0-49406_all.deb ...
Unpacking intel-oneapi-common-vars (2024.0.0-49406) ...
Selecting previously unselected package intel-oneapi-compiler-cpp-eclipse-cfg-2024.0.
Preparing to unpack .../06-intel-oneapi-compiler-cpp-eclipse-cfg-2024.0_2024.0.2-49895_all.deb ...
Unpacking intel-oneapi-compiler-cpp-eclipse-cfg-2024.0 (2024.0.2-49895) ...
Selecting previously unselected package intel-oneapi-compiler-cpp-eclipse-cfg.
Preparing to unpack .../07-intel-oneapi-compiler-cpp-eclipse-cfg_2024.0.2-49895_all.deb ...
Unpacking intel-oneapi-compiler-cpp-eclipse-cfg (2024.0.2-49895) ...
Selecting previously unselected package intel-oneapi-compiler-dpcpp-eclipse-cfg-2024.0.
Preparing to unpack .../08-intel-oneapi-compiler-dpcpp-eclipse-cfg-2024.0_2024.0.2-49895_all.deb ...
Unpacking intel-oneapi-compiler-dpcpp-eclipse-cfg-2024.0 (2024.0.2-49895) ...
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/coll/sync'
make[2]: Leaving directory '/scratch/build/ompi/mca/coll/sync'
Making install in mca/coll/tuned
make[2]: Entering directory '/scratch/build/ompi/mca/coll/tuned'
  CC       coll_tuned_decision_fixed.lo
  CC       coll_tuned_decision_dynamic.lo
  CC       coll_tuned_dynamic_file.lo
  CC       coll_tuned_dynamic_rules.lo
Selecting previously unselected package intel-oneapi-compiler-dpcpp-eclipse-cfg.
Preparing to unpack .../09-intel-oneapi-compiler-dpcpp-eclipse-cfg_2024.0.2-49895_all.deb ...
Unpacking intel-oneapi-compiler-dpcpp-eclipse-cfg (2024.0.2-49895) ...
Selecting previously unselected package intel-oneapi-icc-eclipse-plugin-cpp-2023.0.0.
Preparing to unpack .../10-intel-oneapi-icc-eclipse-plugin-cpp-2023.0.0_2023.0.0-25370_all.deb ...
  CC       coll_tuned_component.lo
  CC       coll_tuned_module.lo
  CC       coll_tuned_allgather_decision.lo
  CC       coll_tuned_allgatherv_decision.lo
  CC       coll_tuned_allreduce_decision.lo
Unpacking intel-oneapi-icc-eclipse-plugin-cpp-2023.0.0 (2023.0.0-25370) ...
Selecting previously unselected package intel-oneapi-compiler-dpcpp-cpp-common-2023.0.0.
Preparing to unpack .../11-intel-oneapi-compiler-dpcpp-cpp-common-2023.0.0_2023.0.0-25370_all.deb ...
Unpacking intel-oneapi-compiler-dpcpp-cpp-common-2023.0.0 (2023.0.0-25370) ...
  CC       coll_tuned_alltoall_decision.lo
  CC       coll_tuned_gather_decision.lo
  CC       coll_tuned_alltoallv_decision.lo
  CC       coll_tuned_barrier_decision.lo
  CC       coll_tuned_reduce_decision.lo
  CC       coll_tuned_bcast_decision.lo
  CC       coll_tuned_reduce_scatter_decision.lo
  CC       coll_tuned_scatter_decision.lo
  CC       coll_tuned_reduce_scatter_block_decision.lo
  CC       coll_tuned_exscan_decision.lo
  CC       coll_tuned_scan_decision.lo
  CCLD     mca_coll_tuned.la
make[3]: Entering directory '/scratch/build/ompi/mca/coll/tuned'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_coll_tuned.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_coll_tuned.la'
libtool: install: (cd /scratch/build/ompi/mca/coll/tuned; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_coll_tuned.la -rpath /opt/openmpi/lib/openmpi coll_tuned_decision_fixed.lo coll_tuned_decision_dynamic.lo coll_tuned_dynamic_file.lo coll_tuned_dynamic_rules.lo coll_tuned_component.lo coll_tuned_module.lo coll_tuned_allgather_decision.lo coll_tuned_allgatherv_decision.lo coll_tuned_allreduce_decision.lo coll_tuned_alltoall_decision.lo coll_tuned_gather_decision.lo coll_tuned_alltoallv_decision.lo coll_tuned_barrier_decision.lo coll_tuned_reduce_decision.lo coll_tuned_bcast_decision.lo coll_tuned_reduce_scatter_decision.lo coll_tuned_scatter_decision.lo coll_tuned_reduce_scatter_block_decision.lo coll_tuned_exscan_decision.lo coll_tuned_scan_decision.lo ../../../../ompi/libmpi.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_coll_tuned.soT /opt/openmpi/lib/openmpi/mca_coll_tuned.so
libtool: install: /usr/bin/install -c .libs/mca_coll_tuned.lai /opt/openmpi/lib/openmpi/mca_coll_tuned.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/coll/tuned'
make[2]: Leaving directory '/scratch/build/ompi/mca/coll/tuned'
Making install in mca/coll/cuda
make[2]: Entering directory '/scratch/build/ompi/mca/coll/cuda'
  CC       coll_cuda_module.lo
  CC       coll_cuda_reduce.lo
  CC       coll_cuda_allreduce.lo
  CC       coll_cuda_reduce_scatter_block.lo
  CC       coll_cuda_component.lo
  CC       coll_cuda_scan.lo
  CC       coll_cuda_exscan.lo
  CCLD     mca_coll_cuda.la
make[3]: Entering directory '/scratch/build/ompi/mca/coll/cuda'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/ompi/mca/coll/cuda/help-mpi-coll-cuda.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_coll_cuda.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_coll_cuda.la'
libtool: install: (cd /scratch/build/ompi/mca/coll/cuda; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_coll_cuda.la -rpath /opt/openmpi/lib/openmpi coll_cuda_module.lo coll_cuda_reduce.lo coll_cuda_allreduce.lo coll_cuda_reduce_scatter_block.lo coll_cuda_component.lo coll_cuda_scan.lo coll_cuda_exscan.lo ../../../../ompi/libmpi.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_coll_cuda.soT /opt/openmpi/lib/openmpi/mca_coll_cuda.so
libtool: install: /usr/bin/install -c .libs/mca_coll_cuda.lai /opt/openmpi/lib/openmpi/mca_coll_cuda.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/coll/cuda'
make[2]: Leaving directory '/scratch/build/ompi/mca/coll/cuda'
Making install in mca/coll/monitoring
make[2]: Entering directory '/scratch/build/ompi/mca/coll/monitoring'
  CC       coll_monitoring_allgather.lo
  CC       coll_monitoring_allgatherv.lo
  CC       coll_monitoring_alltoall.lo
  CC       coll_monitoring_allreduce.lo
Selecting previously unselected package intel-oneapi-condaindex.
Preparing to unpack .../12-intel-oneapi-condaindex_2023.2.0-49417_amd64.deb ...
Unpacking intel-oneapi-condaindex (2023.2.0-49417) ...
  CC       coll_monitoring_alltoallv.lo
  CC       coll_monitoring_alltoallw.lo
  CC       coll_monitoring_barrier.lo
  CC       coll_monitoring_bcast.lo
Selecting previously unselected package intel-oneapi-openmp-common-2023.0.0.
Preparing to unpack .../13-intel-oneapi-openmp-common-2023.0.0_2023.0.0-25370_all.deb ...
Unpacking intel-oneapi-openmp-common-2023.0.0 (2023.0.0-25370) ...
  CC       coll_monitoring_component.lo
  CC       coll_monitoring_exscan.lo
  CC       coll_monitoring_gather.lo
  CC       coll_monitoring_gatherv.lo
Selecting previously unselected package intel-oneapi-openmp-2023.0.0.
Preparing to unpack .../14-intel-oneapi-openmp-2023.0.0_2023.0.0-25370_amd64.deb ...
Unpacking intel-oneapi-openmp-2023.0.0 (2023.0.0-25370) ...
  CC       coll_monitoring_neighbor_allgather.lo
  CC       coll_monitoring_neighbor_allgatherv.lo
  CC       coll_monitoring_neighbor_alltoall.lo
  CC       coll_monitoring_neighbor_alltoallv.lo
  CC       coll_monitoring_neighbor_alltoallw.lo
  CC       coll_monitoring_reduce.lo
  CC       coll_monitoring_reduce_scatter.lo
  CC       coll_monitoring_reduce_scatter_block.lo
  CC       coll_monitoring_scan.lo
  CC       coll_monitoring_scatter.lo
  CC       coll_monitoring_scatterv.lo
  CCLD     mca_coll_monitoring.la
make[3]: Entering directory '/scratch/build/ompi/mca/coll/monitoring'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_coll_monitoring.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_coll_monitoring.la'
libtool: install: (cd /scratch/build/ompi/mca/coll/monitoring; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_coll_monitoring.la -rpath /opt/openmpi/lib/openmpi coll_monitoring_allgather.lo coll_monitoring_allgatherv.lo coll_monitoring_allreduce.lo coll_monitoring_alltoall.lo coll_monitoring_alltoallv.lo coll_monitoring_alltoallw.lo coll_monitoring_barrier.lo coll_monitoring_bcast.lo coll_monitoring_component.lo coll_monitoring_exscan.lo coll_monitoring_gather.lo coll_monitoring_gatherv.lo coll_monitoring_neighbor_allgather.lo coll_monitoring_neighbor_allgatherv.lo coll_monitoring_neighbor_alltoall.lo coll_monitoring_neighbor_alltoallv.lo coll_monitoring_neighbor_alltoallw.lo coll_monitoring_reduce.lo coll_monitoring_reduce_scatter.lo coll_monitoring_reduce_scatter_block.lo coll_monitoring_scan.lo coll_monitoring_scatter.lo coll_monitoring_scatterv.lo ../../../../ompi/libmpi.la /scratch/build/ompi/mca/common/monitoring/libmca_common_monitoring.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_coll_monitoring.soT /opt/openmpi/lib/openmpi/mca_coll_monitoring.so
libtool: install: /usr/bin/install -c .libs/mca_coll_monitoring.lai /opt/openmpi/lib/openmpi/mca_coll_monitoring.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/coll/monitoring'
make[2]: Leaving directory '/scratch/build/ompi/mca/coll/monitoring'
Making install in mca/fbtl/posix
make[2]: Entering directory '/scratch/build/ompi/mca/fbtl/posix'
  CC       fbtl_posix.lo
  CC       fbtl_posix_component.lo
  CC       fbtl_posix_ipreadv.lo
  CC       fbtl_posix_preadv.lo
  CC       fbtl_posix_pwritev.lo
  CC       fbtl_posix_ipwritev.lo
  CC       fbtl_posix_lock.lo
  CCLD     mca_fbtl_posix.la
make[3]: Entering directory '/scratch/build/ompi/mca/fbtl/posix'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_fbtl_posix.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_fbtl_posix.la'
libtool: install: (cd /scratch/build/ompi/mca/fbtl/posix; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_fbtl_posix.la -rpath /opt/openmpi/lib/openmpi fbtl_posix.lo fbtl_posix_component.lo fbtl_posix_preadv.lo fbtl_posix_ipreadv.lo fbtl_posix_pwritev.lo fbtl_posix_ipwritev.lo fbtl_posix_lock.lo ../../../../ompi/libmpi.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_fbtl_posix.soT /opt/openmpi/lib/openmpi/mca_fbtl_posix.so
libtool: install: /usr/bin/install -c .libs/mca_fbtl_posix.lai /opt/openmpi/lib/openmpi/mca_fbtl_posix.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/fbtl/posix'
make[2]: Leaving directory '/scratch/build/ompi/mca/fbtl/posix'
Making install in mca/fcoll/dynamic
make[2]: Entering directory '/scratch/build/ompi/mca/fcoll/dynamic'
  CC       fcoll_dynamic_module.lo
  CC       fcoll_dynamic_component.lo
  CC       fcoll_dynamic_file_write_all.lo
  CC       fcoll_dynamic_file_read_all.lo
  CCLD     mca_fcoll_dynamic.la
make[3]: Entering directory '/scratch/build/ompi/mca/fcoll/dynamic'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_fcoll_dynamic.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_fcoll_dynamic.la'
libtool: install: (cd /scratch/build/ompi/mca/fcoll/dynamic; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_fcoll_dynamic.la -rpath /opt/openmpi/lib/openmpi fcoll_dynamic_module.lo fcoll_dynamic_component.lo fcoll_dynamic_file_read_all.lo fcoll_dynamic_file_write_all.lo /scratch/build/ompi/mca/common/ompio/libmca_common_ompio.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_fcoll_dynamic.soT /opt/openmpi/lib/openmpi/mca_fcoll_dynamic.so
libtool: install: /usr/bin/install -c .libs/mca_fcoll_dynamic.lai /opt/openmpi/lib/openmpi/mca_fcoll_dynamic.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/fcoll/dynamic'
make[2]: Leaving directory '/scratch/build/ompi/mca/fcoll/dynamic'
Making install in mca/fcoll/dynamic_gen2
make[2]: Entering directory '/scratch/build/ompi/mca/fcoll/dynamic_gen2'
  CC       fcoll_dynamic_gen2_module.lo
  CC       fcoll_dynamic_gen2_component.lo
  CC       fcoll_dynamic_gen2_file_read_all.lo
  CC       fcoll_dynamic_gen2_file_write_all.lo
  CCLD     mca_fcoll_dynamic_gen2.la
make[3]: Entering directory '/scratch/build/ompi/mca/fcoll/dynamic_gen2'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_fcoll_dynamic_gen2.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_fcoll_dynamic_gen2.la'
libtool: install: (cd /scratch/build/ompi/mca/fcoll/dynamic_gen2; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_fcoll_dynamic_gen2.la -rpath /opt/openmpi/lib/openmpi fcoll_dynamic_gen2_module.lo fcoll_dynamic_gen2_component.lo fcoll_dynamic_gen2_file_read_all.lo fcoll_dynamic_gen2_file_write_all.lo /scratch/build/ompi/mca/common/ompio/libmca_common_ompio.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_fcoll_dynamic_gen2.soT /opt/openmpi/lib/openmpi/mca_fcoll_dynamic_gen2.so
libtool: install: /usr/bin/install -c .libs/mca_fcoll_dynamic_gen2.lai /opt/openmpi/lib/openmpi/mca_fcoll_dynamic_gen2.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/fcoll/dynamic_gen2'
make[2]: Leaving directory '/scratch/build/ompi/mca/fcoll/dynamic_gen2'
Making install in mca/fcoll/individual
make[2]: Entering directory '/scratch/build/ompi/mca/fcoll/individual'
  CC       fcoll_individual_module.lo
  CC       fcoll_individual_component.lo
  CC       fcoll_individual_file_write_all.lo
  CC       fcoll_individual_file_read_all.lo
  CCLD     mca_fcoll_individual.la
make[3]: Entering directory '/scratch/build/ompi/mca/fcoll/individual'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_fcoll_individual.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_fcoll_individual.la'
libtool: install: (cd /scratch/build/ompi/mca/fcoll/individual; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_fcoll_individual.la -rpath /opt/openmpi/lib/openmpi fcoll_individual_module.lo fcoll_individual_component.lo fcoll_individual_file_read_all.lo fcoll_individual_file_write_all.lo /scratch/build/ompi/mca/common/ompio/libmca_common_ompio.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_fcoll_individual.soT /opt/openmpi/lib/openmpi/mca_fcoll_individual.so
libtool: install: /usr/bin/install -c .libs/mca_fcoll_individual.lai /opt/openmpi/lib/openmpi/mca_fcoll_individual.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/fcoll/individual'
make[2]: Leaving directory '/scratch/build/ompi/mca/fcoll/individual'
Making install in mca/fcoll/two_phase
make[2]: Entering directory '/scratch/build/ompi/mca/fcoll/two_phase'
  CC       fcoll_two_phase_module.lo
  CC       fcoll_two_phase_component.lo
  CC       fcoll_two_phase_file_write_all.lo
  CC       fcoll_two_phase_file_read_all.lo
  CC       fcoll_two_phase_support_fns.lo
  CCLD     mca_fcoll_two_phase.la
make[3]: Entering directory '/scratch/build/ompi/mca/fcoll/two_phase'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_fcoll_two_phase.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_fcoll_two_phase.la'
libtool: install: (cd /scratch/build/ompi/mca/fcoll/two_phase; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_fcoll_two_phase.la -rpath /opt/openmpi/lib/openmpi fcoll_two_phase_module.lo fcoll_two_phase_component.lo fcoll_two_phase_file_read_all.lo fcoll_two_phase_file_write_all.lo fcoll_two_phase_support_fns.lo /scratch/build/ompi/mca/common/ompio/libmca_common_ompio.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_fcoll_two_phase.soT /opt/openmpi/lib/openmpi/mca_fcoll_two_phase.so
libtool: install: /usr/bin/install -c .libs/mca_fcoll_two_phase.lai /opt/openmpi/lib/openmpi/mca_fcoll_two_phase.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/fcoll/two_phase'
make[2]: Leaving directory '/scratch/build/ompi/mca/fcoll/two_phase'
Making install in mca/fcoll/vulcan
make[2]: Entering directory '/scratch/build/ompi/mca/fcoll/vulcan'
  CC       fcoll_vulcan_module.lo
  CC       fcoll_vulcan_component.lo
  CC       fcoll_vulcan_file_read_all.lo
  CC       fcoll_vulcan_file_write_all.lo
  CCLD     mca_fcoll_vulcan.la
make[3]: Entering directory '/scratch/build/ompi/mca/fcoll/vulcan'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_fcoll_vulcan.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_fcoll_vulcan.la'
libtool: install: (cd /scratch/build/ompi/mca/fcoll/vulcan; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_fcoll_vulcan.la -rpath /opt/openmpi/lib/openmpi fcoll_vulcan_module.lo fcoll_vulcan_component.lo fcoll_vulcan_file_read_all.lo fcoll_vulcan_file_write_all.lo /scratch/build/ompi/mca/common/ompio/libmca_common_ompio.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_fcoll_vulcan.soT /opt/openmpi/lib/openmpi/mca_fcoll_vulcan.so
libtool: install: /usr/bin/install -c .libs/mca_fcoll_vulcan.lai /opt/openmpi/lib/openmpi/mca_fcoll_vulcan.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/fcoll/vulcan'
make[2]: Leaving directory '/scratch/build/ompi/mca/fcoll/vulcan'
Making install in mca/fs/ufs
make[2]: Entering directory '/scratch/build/ompi/mca/fs/ufs'
  CC       fs_ufs.lo
  CC       fs_ufs_component.lo
  CC       fs_ufs_file_open.lo
  CCLD     mca_fs_ufs.la
make[3]: Entering directory '/scratch/build/ompi/mca/fs/ufs'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_fs_ufs.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_fs_ufs.la'
libtool: install: (cd /scratch/build/ompi/mca/fs/ufs; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_fs_ufs.la -rpath /opt/openmpi/lib/openmpi fs_ufs.lo fs_ufs_component.lo fs_ufs_file_open.lo ../../../../ompi/libmpi.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_fs_ufs.soT /opt/openmpi/lib/openmpi/mca_fs_ufs.so
libtool: install: /usr/bin/install -c .libs/mca_fs_ufs.lai /opt/openmpi/lib/openmpi/mca_fs_ufs.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/fs/ufs'
make[2]: Leaving directory '/scratch/build/ompi/mca/fs/ufs'
Making install in mca/io/ompio
make[2]: Entering directory '/scratch/build/ompi/mca/io/ompio'
  CC       io_ompio.lo
  CC       io_ompio_component.lo
  CC       io_ompio_module.lo
  CC       io_ompio_file_set_view.lo
  CC       io_ompio_file_open.lo
  CC       io_ompio_file_write.lo
  CC       io_ompio_file_read.lo
  CCLD     mca_io_ompio.la
make[3]: Entering directory '/scratch/build/ompi/mca/io/ompio'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_io_ompio.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_io_ompio.la'
libtool: install: (cd /scratch/build/ompi/mca/io/ompio; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_io_ompio.la -rpath /opt/openmpi/lib/openmpi io_ompio.lo io_ompio_component.lo io_ompio_module.lo io_ompio_file_set_view.lo io_ompio_file_open.lo io_ompio_file_write.lo io_ompio_file_read.lo /scratch/build/ompi/mca/common/ompio/libmca_common_ompio.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_io_ompio.soT /opt/openmpi/lib/openmpi/mca_io_ompio.so
libtool: install: /usr/bin/install -c .libs/mca_io_ompio.lai /opt/openmpi/lib/openmpi/mca_io_ompio.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/io/ompio'
make[2]: Leaving directory '/scratch/build/ompi/mca/io/ompio'
Making install in mca/io/romio321
make[2]: Entering directory '/scratch/build/ompi/mca/io/romio321'
Making install in romio
make[3]: Entering directory '/scratch/build/ompi/mca/io/romio321/romio'
make[4]: Entering directory '/scratch/build/ompi/mca/io/romio321/romio'
  CC       mpi-io/close.lo
  CC       mpi-io/delete.lo
  CC       mpi-io/fsync.lo
  CC       mpi-io/get_amode.lo
  CC       mpi-io/get_atom.lo
  CC       mpi-io/get_bytoff.lo
  CC       mpi-io/get_extent.lo
  CC       mpi-io/get_group.lo
  CC       mpi-io/get_info.lo
  CC       mpi-io/get_posn.lo
  CC       mpi-io/get_posn_sh.lo
  CC       mpi-io/get_size.lo
  CC       mpi-io/get_view.lo
  CC       mpi-io/iread.lo
  CC       mpi-io/iread_all.lo
  CC       mpi-io/iread_at.lo
Selecting previously unselected package intel-oneapi-compiler-shared-runtime-2023.0.0.
Preparing to unpack .../15-intel-oneapi-compiler-shared-runtime-2023.0.0_2023.0.0-25370_amd64.deb ...
Unpacking intel-oneapi-compiler-shared-runtime-2023.0.0 (2023.0.0-25370) ...
  CC       mpi-io/iread_atall.lo
  CC       mpi-io/iread_sh.lo
  CC       mpi-io/iwrite.lo
  CC       mpi-io/iwrite_all.lo
  CC       mpi-io/iwrite_at.lo
  CC       mpi-io/iwrite_atall.lo
  CC       mpi-io/iwrite_sh.lo
  CC       mpi-io/open.lo
  CC       mpi-io/prealloc.lo
  CC       mpi-io/rd_atallb.lo
  CC       mpi-io/rd_atalle.lo
  CC       mpi-io/read.lo
  CC       mpi-io/read_all.lo
  CC       mpi-io/read_allb.lo
  CC       mpi-io/read_alle.lo
  CC       mpi-io/read_at.lo
  CC       mpi-io/read_atall.lo
  CC       mpi-io/read_ord.lo
  CC       mpi-io/read_ordb.lo
  CC       mpi-io/read_orde.lo
  CC       mpi-io/read_sh.lo
  CC       mpi-io/register_datarep.lo
  CC       mpi-io/seek.lo
  CC       mpi-io/seek_sh.lo
  CC       mpi-io/set_atom.lo
  CC       mpi-io/set_info.lo
  CC       mpi-io/set_size.lo
  CC       mpi-io/set_view.lo
  CC       mpi-io/wr_atallb.lo
  CC       mpi-io/wr_atalle.lo
  CC       mpi-io/write.lo
  CC       mpi-io/write_all.lo
  CC       mpi-io/write_allb.lo
  CC       mpi-io/write_alle.lo
  CC       mpi-io/write_at.lo
  CC       mpi-io/write_atall.lo
  CC       mpi-io/write_ord.lo
  CC       mpi-io/write_ordb.lo
  CC       mpi-io/write_orde.lo
  CC       mpi-io/write_sh.lo
  CC       mpi-io/glue/openmpi/mpio_file.lo
  CC       mpi-io/glue/openmpi/mpio_err.lo
  CC       mpi-io/mpich_fileutil.lo
  CC       mpi-io/mpir-mpioinit.lo
  CC       mpi-io/mpiu_greq.lo
  CC       mpi-io/mpiu_external32.lo
  CC       adio/ad_nfs/ad_nfs_read.lo
  CC       adio/ad_nfs/ad_nfs_open.lo
  CC       adio/ad_nfs/ad_nfs_write.lo
  CC       adio/ad_nfs/ad_nfs_done.lo
  CC       adio/ad_nfs/ad_nfs_fcntl.lo
  CC       adio/ad_nfs/ad_nfs_iread.lo
  CC       adio/ad_nfs/ad_nfs_iwrite.lo
  CC       adio/ad_nfs/ad_nfs_wait.lo
  CC       adio/ad_nfs/ad_nfs_setsh.lo
  CC       adio/ad_nfs/ad_nfs_getsh.lo
  CC       adio/ad_nfs/ad_nfs.lo
  CC       adio/ad_nfs/ad_nfs_resize.lo
  CC       adio/ad_nfs/ad_nfs_features.lo
  CC       adio/ad_testfs/ad_testfs_close.lo
  CC       adio/ad_testfs/ad_testfs_read.lo
  CC       adio/ad_testfs/ad_testfs_rdcoll.lo
  CC       adio/ad_testfs/ad_testfs_wrcoll.lo
  CC       adio/ad_testfs/ad_testfs_open.lo
  CC       adio/ad_testfs/ad_testfs_write.lo
  CC       adio/ad_testfs/ad_testfs_done.lo
  CC       adio/ad_testfs/ad_testfs_fcntl.lo
  CC       adio/ad_testfs/ad_testfs_iread.lo
  CC       adio/ad_testfs/ad_testfs_iwrite.lo
  CC       adio/ad_testfs/ad_testfs_wait.lo
  CC       adio/ad_testfs/ad_testfs_flush.lo
  CC       adio/ad_testfs/ad_testfs_seek.lo
  CC       adio/ad_testfs/ad_testfs_resize.lo
  CC       adio/ad_testfs/ad_testfs_hints.lo
  CC       adio/ad_testfs/ad_testfs_delete.lo
  CC       adio/ad_testfs/ad_testfs.lo
  CC       adio/ad_ufs/ad_ufs.lo
  CC       adio/ad_ufs/ad_ufs_open.lo
  CC       adio/common/ad_aggregate.lo
  CC       adio/common/ad_aggregate_new.lo
  CC       adio/common/ad_close.lo
  CC       adio/common/ad_coll_build_req_new.lo
  CC       adio/common/ad_coll_exch_new.lo
  CC       adio/common/ad_darray.lo
  CC       adio/common/ad_delete.lo
  CC       adio/common/ad_done.lo
  CC       adio/common/ad_done_fake.lo
  CC       adio/common/ad_end.lo
  CC       adio/common/ad_fcntl.lo
  CC       adio/common/ad_features.lo
  CC       adio/common/ad_flush.lo
  CC       adio/common/ad_fstype.lo
  CC       adio/common/ad_get_sh_fp.lo
  CC       adio/common/ad_hints.lo
  CC       adio/common/ad_init.lo
  CC       adio/common/ad_io_coll.lo
  CC       adio/common/ad_iopen.lo
  CC       adio/common/ad_iread.lo
  CC       adio/common/ad_iread_coll.lo
  CC       adio/common/ad_iread_fake.lo
  CC       adio/common/ad_iwrite.lo
  CC       adio/common/ad_iwrite_coll.lo
  CC       adio/common/ad_iwrite_fake.lo
  CC       adio/common/ad_open.lo
  CC       adio/common/ad_opencoll.lo
  CC       adio/common/ad_opencoll_failsafe.lo
  CC       adio/common/ad_opencoll_scalable.lo
  CC       adio/common/ad_prealloc.lo
  CC       adio/common/ad_read.lo
  CC       adio/common/ad_read_coll.lo
  CC       adio/common/ad_read_str.lo
  CC       adio/common/ad_read_str_naive.lo
  CC       adio/common/ad_resize.lo
  CC       adio/common/ad_seek.lo
  CC       adio/common/ad_set_view.lo
  CC       adio/common/ad_set_sh_fp.lo
  CC       adio/common/ad_subarray.lo
  CC       adio/common/ad_wait.lo
  CC       adio/common/ad_wait_fake.lo
  CC       adio/common/ad_write.lo
  CC       adio/common/ad_write_coll.lo
  CC       adio/common/ad_write_nolock.lo
  CC       adio/common/ad_write_str_naive.lo
  CC       adio/common/ad_write_str.lo
  CC       adio/common/adi_close.lo
  CC       adio/common/byte_offset.lo
  CC       adio/common/cb_config_list.lo
  CC       adio/common/eof_offset.lo
  CC       adio/common/error.lo
  CC       adio/common/flatten.lo
  CC       adio/common/get_fp_posn.lo
  CC       adio/common/greq_fns.lo
  CC       adio/common/heap-sort.lo
  CC       adio/common/iscontig.lo
  CC       adio/common/lock.lo
  CC       adio/common/malloc.lo
  CC       adio/common/shfp_fname.lo
  CC       adio/common/status_setb.lo
  CC       adio/common/strfns.lo
  CC       adio/common/system_hints.lo
  CC       adio/common/hint_fns.lo
  CC       adio/common/ad_threaded_io.lo
  CC       adio/common/p2p_aggregation.lo
  CC       adio/common/onesided_aggregation.lo
  CC       adio/common/utils.lo
  CCLD     libromio_dist.la
ar: `u' modifier ignored since `D' is the default (see `U')
make[5]: Entering directory '/scratch/build/ompi/mca/io/romio321/romio'
make[5]: Leaving directory '/scratch/build/ompi/mca/io/romio321/romio'
make[4]: Leaving directory '/scratch/build/ompi/mca/io/romio321/romio'
make[3]: Leaving directory '/scratch/build/ompi/mca/io/romio321/romio'
make[3]: Entering directory '/scratch/build/ompi/mca/io/romio321'
  CC       src/io_romio321_component.lo
  CC       src/io_romio321_file_open.lo
  CC       src/io_romio321_file_read.lo
  CC       src/io_romio321_file_write.lo
  CC       src/io_romio321_module.lo
  CCLD     mca_io_romio321.la
make[4]: Entering directory '/scratch/build/ompi/mca/io/romio321'
make[4]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_io_romio321.la '/opt/openmpi/lib/openmpi'
libtool: install: /usr/bin/install -c .libs/mca_io_romio321.so /opt/openmpi/lib/openmpi/mca_io_romio321.so
libtool: install: /usr/bin/install -c .libs/mca_io_romio321.lai /opt/openmpi/lib/openmpi/mca_io_romio321.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[4]: Leaving directory '/scratch/build/ompi/mca/io/romio321'
make[3]: Leaving directory '/scratch/build/ompi/mca/io/romio321'
make[2]: Leaving directory '/scratch/build/ompi/mca/io/romio321'
Making install in mca/osc/sm
make[2]: Entering directory '/scratch/build/ompi/mca/osc/sm'
  CC       osc_sm_comm.lo
  CC       osc_sm_component.lo
  CC       osc_sm_active_target.lo
  CC       osc_sm_passive_target.lo
  CCLD     mca_osc_sm.la
make[3]: Entering directory '/scratch/build/ompi/mca/osc/sm'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_osc_sm.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_osc_sm.la'
libtool: install: (cd /scratch/build/ompi/mca/osc/sm; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_osc_sm.la -rpath /opt/openmpi/lib/openmpi osc_sm_comm.lo osc_sm_component.lo osc_sm_active_target.lo osc_sm_passive_target.lo ../../../../ompi/libmpi.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_osc_sm.soT /opt/openmpi/lib/openmpi/mca_osc_sm.so
libtool: install: /usr/bin/install -c .libs/mca_osc_sm.lai /opt/openmpi/lib/openmpi/mca_osc_sm.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/osc/sm'
make[2]: Leaving directory '/scratch/build/ompi/mca/osc/sm'
Making install in mca/osc/monitoring
make[2]: Entering directory '/scratch/build/ompi/mca/osc/monitoring'
  CC       osc_monitoring_component.lo
  CCLD     mca_osc_monitoring.la
make[3]: Entering directory '/scratch/build/ompi/mca/osc/monitoring'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_osc_monitoring.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_osc_monitoring.la'
libtool: install: (cd /scratch/build/ompi/mca/osc/monitoring; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_osc_monitoring.la -rpath /opt/openmpi/lib/openmpi osc_monitoring_component.lo ../../../../ompi/libmpi.la /scratch/build/ompi/mca/common/monitoring/libmca_common_monitoring.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_osc_monitoring.soT /opt/openmpi/lib/openmpi/mca_osc_monitoring.so
libtool: install: /usr/bin/install -c .libs/mca_osc_monitoring.lai /opt/openmpi/lib/openmpi/mca_osc_monitoring.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/osc/monitoring'
make[2]: Leaving directory '/scratch/build/ompi/mca/osc/monitoring'
Making install in mca/osc/pt2pt
make[2]: Entering directory '/scratch/build/ompi/mca/osc/pt2pt'
  CC       osc_pt2pt_module.lo
  CC       osc_pt2pt_comm.lo
  CC       osc_pt2pt_component.lo
  CC       osc_pt2pt_data_move.lo
  CC       osc_pt2pt_frag.lo
  CC       osc_pt2pt_request.lo
  CC       osc_pt2pt_active_target.lo
  CC       osc_pt2pt_passive_target.lo
  CC       osc_pt2pt_sync.lo
  CCLD     mca_osc_pt2pt.la
make[3]: Entering directory '/scratch/build/ompi/mca/osc/pt2pt'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/ompi/mca/osc/pt2pt/help-osc-pt2pt.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_osc_pt2pt.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_osc_pt2pt.la'
libtool: install: (cd /scratch/build/ompi/mca/osc/pt2pt; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_osc_pt2pt.la -rpath /opt/openmpi/lib/openmpi osc_pt2pt_module.lo osc_pt2pt_comm.lo osc_pt2pt_component.lo osc_pt2pt_data_move.lo osc_pt2pt_frag.lo osc_pt2pt_request.lo osc_pt2pt_active_target.lo osc_pt2pt_passive_target.lo osc_pt2pt_sync.lo ../../../../ompi/libmpi.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_osc_pt2pt.soT /opt/openmpi/lib/openmpi/mca_osc_pt2pt.so
libtool: install: /usr/bin/install -c .libs/mca_osc_pt2pt.lai /opt/openmpi/lib/openmpi/mca_osc_pt2pt.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/osc/pt2pt'
make[2]: Leaving directory '/scratch/build/ompi/mca/osc/pt2pt'
Making install in mca/osc/rdma
make[2]: Entering directory '/scratch/build/ompi/mca/osc/rdma'
  CC       osc_rdma_module.lo
  CC       osc_rdma_comm.lo
  CC       osc_rdma_accumulate.lo
  CC       osc_rdma_component.lo
  CC       osc_rdma_frag.lo
  CC       osc_rdma_request.lo
  CC       osc_rdma_active_target.lo
  CC       osc_rdma_passive_target.lo
  CC       osc_rdma_peer.lo
  CC       osc_rdma_dynamic.lo
  CC       osc_rdma_sync.lo
  CCLD     mca_osc_rdma.la
make[3]: Entering directory '/scratch/build/ompi/mca/osc/rdma'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_osc_rdma.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_osc_rdma.la'
libtool: install: (cd /scratch/build/ompi/mca/osc/rdma; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_osc_rdma.la -rpath /opt/openmpi/lib/openmpi osc_rdma_module.lo osc_rdma_comm.lo osc_rdma_accumulate.lo osc_rdma_component.lo osc_rdma_frag.lo osc_rdma_request.lo osc_rdma_active_target.lo osc_rdma_passive_target.lo osc_rdma_peer.lo osc_rdma_dynamic.lo osc_rdma_sync.lo ../../../../ompi/libmpi.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_osc_rdma.soT /opt/openmpi/lib/openmpi/mca_osc_rdma.so
libtool: install: /usr/bin/install -c .libs/mca_osc_rdma.lai /opt/openmpi/lib/openmpi/mca_osc_rdma.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/osc/rdma'
make[2]: Leaving directory '/scratch/build/ompi/mca/osc/rdma'
Making install in mca/pml/cm
make[2]: Entering directory '/scratch/build/ompi/mca/pml/cm'
  CC       pml_cm.lo
  CC       pml_cm_cancel.lo
  CC       pml_cm_recvreq.lo
  CC       pml_cm_component.lo
  CC       pml_cm_request.lo
  CC       pml_cm_sendreq.lo
  CC       pml_cm_start.lo
  CCLD     mca_pml_cm.la
make[3]: Entering directory '/scratch/build/ompi/mca/pml/cm'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_pml_cm.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_pml_cm.la'
libtool: install: (cd /scratch/build/ompi/mca/pml/cm; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_pml_cm.la -rpath /opt/openmpi/lib/openmpi pml_cm.lo pml_cm_cancel.lo pml_cm_component.lo pml_cm_recvreq.lo pml_cm_request.lo pml_cm_sendreq.lo pml_cm_start.lo ../../../../ompi/libmpi.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_pml_cm.soT /opt/openmpi/lib/openmpi/mca_pml_cm.so
libtool: install: /usr/bin/install -c .libs/mca_pml_cm.lai /opt/openmpi/lib/openmpi/mca_pml_cm.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/pml/cm'
make[2]: Leaving directory '/scratch/build/ompi/mca/pml/cm'
Making install in mca/pml/monitoring
make[2]: Entering directory '/scratch/build/ompi/mca/pml/monitoring'
  CC       pml_monitoring_comm.lo
  CC       pml_monitoring_component.lo
  CC       pml_monitoring_iprobe.lo
  CC       pml_monitoring_irecv.lo
  CC       pml_monitoring_isend.lo
  CC       pml_monitoring_start.lo
  CCLD     mca_pml_monitoring.la
make[3]: Entering directory '/scratch/build/ompi/mca/pml/monitoring'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_pml_monitoring.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_pml_monitoring.la'
libtool: install: (cd /scratch/build/ompi/mca/pml/monitoring; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_pml_monitoring.la -rpath /opt/openmpi/lib/openmpi pml_monitoring_comm.lo pml_monitoring_component.lo pml_monitoring_iprobe.lo pml_monitoring_irecv.lo pml_monitoring_isend.lo pml_monitoring_start.lo ../../../../ompi/libmpi.la /scratch/build/ompi/mca/common/monitoring/libmca_common_monitoring.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_pml_monitoring.soT /opt/openmpi/lib/openmpi/mca_pml_monitoring.so
libtool: install: /usr/bin/install -c .libs/mca_pml_monitoring.lai /opt/openmpi/lib/openmpi/mca_pml_monitoring.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/pml/monitoring'
make[2]: Leaving directory '/scratch/build/ompi/mca/pml/monitoring'
Making install in mca/pml/ob1
make[2]: Entering directory '/scratch/build/ompi/mca/pml/ob1'
  CC       pml_ob1.lo
  CC       pml_ob1_comm.lo
  CC       pml_ob1_component.lo
  CC       pml_ob1_iprobe.lo
  CC       pml_ob1_irecv.lo
  CC       pml_ob1_isend.lo
  CC       pml_ob1_progress.lo
  CC       pml_ob1_rdma.lo
  CC       pml_ob1_rdmafrag.lo
  CC       pml_ob1_recvfrag.lo
  CC       pml_ob1_recvreq.lo
  CC       pml_ob1_sendreq.lo
  CC       pml_ob1_start.lo
  CC       pml_ob1_cuda.lo
  CCLD     mca_pml_ob1.la
make[3]: Entering directory '/scratch/build/ompi/mca/pml/ob1'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ../../../../../openmpi/ompi/mca/pml/ob1/help-mpi-pml-ob1.txt '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_pml_ob1.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_pml_ob1.la'
libtool: install: (cd /scratch/build/ompi/mca/pml/ob1; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_pml_ob1.la -rpath /opt/openmpi/lib/openmpi pml_ob1.lo pml_ob1_comm.lo pml_ob1_component.lo pml_ob1_iprobe.lo pml_ob1_irecv.lo pml_ob1_isend.lo pml_ob1_progress.lo pml_ob1_rdma.lo pml_ob1_rdmafrag.lo pml_ob1_recvfrag.lo pml_ob1_recvreq.lo pml_ob1_sendreq.lo pml_ob1_start.lo pml_ob1_cuda.lo ../../../../ompi/libmpi.la /scratch/build/opal/mca/common/cuda/libmca_common_cuda.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_pml_ob1.soT /opt/openmpi/lib/openmpi/mca_pml_ob1.so
libtool: install: /usr/bin/install -c .libs/mca_pml_ob1.lai /opt/openmpi/lib/openmpi/mca_pml_ob1.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/pml/ob1'
make[2]: Leaving directory '/scratch/build/ompi/mca/pml/ob1'
Making install in mca/sharedfp/individual
make[2]: Entering directory '/scratch/build/ompi/mca/sharedfp/individual'
  CC       sharedfp_individual.lo
  CC       sharedfp_individual_component.lo
  CC       sharedfp_individual_seek.lo
  CC       sharedfp_individual_get_position.lo
  CC       sharedfp_individual_collaborate_data.lo
  CC       sharedfp_individual_write.lo
  CC       sharedfp_individual_iwrite.lo
  CC       sharedfp_individual_read.lo
  CC       sharedfp_individual_insert_metadata.lo
  CC       sharedfp_individual_file_open.lo
  CC       sharedfp_individual_gettime.lo
  CCLD     mca_sharedfp_individual.la
make[3]: Entering directory '/scratch/build/ompi/mca/sharedfp/individual'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_sharedfp_individual.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_sharedfp_individual.la'
libtool: install: (cd /scratch/build/ompi/mca/sharedfp/individual; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_sharedfp_individual.la -rpath /opt/openmpi/lib/openmpi sharedfp_individual.lo sharedfp_individual_component.lo sharedfp_individual_seek.lo sharedfp_individual_get_position.lo sharedfp_individual_collaborate_data.lo sharedfp_individual_write.lo sharedfp_individual_iwrite.lo sharedfp_individual_read.lo sharedfp_individual_insert_metadata.lo sharedfp_individual_file_open.lo sharedfp_individual_gettime.lo /scratch/build/ompi/mca/common/ompio/libmca_common_ompio.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_sharedfp_individual.soT /opt/openmpi/lib/openmpi/mca_sharedfp_individual.so
libtool: install: /usr/bin/install -c .libs/mca_sharedfp_individual.lai /opt/openmpi/lib/openmpi/mca_sharedfp_individual.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/sharedfp/individual'
make[2]: Leaving directory '/scratch/build/ompi/mca/sharedfp/individual'
Making install in mca/sharedfp/lockedfile
make[2]: Entering directory '/scratch/build/ompi/mca/sharedfp/lockedfile'
  CC       sharedfp_lockedfile.lo
  CC       sharedfp_lockedfile_component.lo
  CC       sharedfp_lockedfile_seek.lo
  CC       sharedfp_lockedfile_get_position.lo
  CC       sharedfp_lockedfile_request_position.lo
  CC       sharedfp_lockedfile_write.lo
  CC       sharedfp_lockedfile_iwrite.lo
  CC       sharedfp_lockedfile_read.lo
  CC       sharedfp_lockedfile_iread.lo
  CC       sharedfp_lockedfile_file_open.lo
  CCLD     mca_sharedfp_lockedfile.la
make[3]: Entering directory '/scratch/build/ompi/mca/sharedfp/lockedfile'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_sharedfp_lockedfile.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_sharedfp_lockedfile.la'
libtool: install: (cd /scratch/build/ompi/mca/sharedfp/lockedfile; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_sharedfp_lockedfile.la -rpath /opt/openmpi/lib/openmpi sharedfp_lockedfile.lo sharedfp_lockedfile_component.lo sharedfp_lockedfile_seek.lo sharedfp_lockedfile_get_position.lo sharedfp_lockedfile_request_position.lo sharedfp_lockedfile_write.lo sharedfp_lockedfile_iwrite.lo sharedfp_lockedfile_read.lo sharedfp_lockedfile_iread.lo sharedfp_lockedfile_file_open.lo /scratch/build/ompi/mca/common/ompio/libmca_common_ompio.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_sharedfp_lockedfile.soT /opt/openmpi/lib/openmpi/mca_sharedfp_lockedfile.so
libtool: install: /usr/bin/install -c .libs/mca_sharedfp_lockedfile.lai /opt/openmpi/lib/openmpi/mca_sharedfp_lockedfile.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/sharedfp/lockedfile'
make[2]: Leaving directory '/scratch/build/ompi/mca/sharedfp/lockedfile'
Making install in mca/sharedfp/sm
make[2]: Entering directory '/scratch/build/ompi/mca/sharedfp/sm'
  CC       sharedfp_sm.lo
  CC       sharedfp_sm_component.lo
  CC       sharedfp_sm_seek.lo
  CC       sharedfp_sm_get_position.lo
  CC       sharedfp_sm_request_position.lo
  CC       sharedfp_sm_write.lo
  CC       sharedfp_sm_iwrite.lo
  CC       sharedfp_sm_read.lo
  CC       sharedfp_sm_iread.lo
  CC       sharedfp_sm_file_open.lo
  CCLD     mca_sharedfp_sm.la
make[3]: Entering directory '/scratch/build/ompi/mca/sharedfp/sm'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_sharedfp_sm.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_sharedfp_sm.la'
libtool: install: (cd /scratch/build/ompi/mca/sharedfp/sm; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_sharedfp_sm.la -rpath /opt/openmpi/lib/openmpi sharedfp_sm.lo sharedfp_sm_component.lo sharedfp_sm_seek.lo sharedfp_sm_get_position.lo sharedfp_sm_request_position.lo sharedfp_sm_write.lo sharedfp_sm_iwrite.lo sharedfp_sm_read.lo sharedfp_sm_iread.lo sharedfp_sm_file_open.lo /scratch/build/ompi/mca/common/ompio/libmca_common_ompio.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_sharedfp_sm.soT /opt/openmpi/lib/openmpi/mca_sharedfp_sm.so
libtool: install: /usr/bin/install -c .libs/mca_sharedfp_sm.lai /opt/openmpi/lib/openmpi/mca_sharedfp_sm.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/sharedfp/sm'
make[2]: Leaving directory '/scratch/build/ompi/mca/sharedfp/sm'
Making install in mca/topo/basic
make[2]: Entering directory '/scratch/build/ompi/mca/topo/basic'
  CC       topo_basic_component.lo
  CCLD     mca_topo_basic.la
make[3]: Entering directory '/scratch/build/ompi/mca/topo/basic'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_topo_basic.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_topo_basic.la'
libtool: install: (cd /scratch/build/ompi/mca/topo/basic; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_topo_basic.la -rpath /opt/openmpi/lib/openmpi topo_basic_component.lo ../../../../ompi/libmpi.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_topo_basic.soT /opt/openmpi/lib/openmpi/mca_topo_basic.so
libtool: install: /usr/bin/install -c .libs/mca_topo_basic.lai /opt/openmpi/lib/openmpi/mca_topo_basic.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/topo/basic'
make[2]: Leaving directory '/scratch/build/ompi/mca/topo/basic'
Making install in mca/topo/treematch
make[2]: Entering directory '/scratch/build/ompi/mca/topo/treematch'
  CC       topo_treematch_module.lo
  CC       topo_treematch_component.lo
  CC       topo_treematch_dist_graph_create.lo
  CC       treematch/IntConstantInitializedVector.lo
  CC       treematch/tm_mt.lo
  CC       treematch/fibo.lo
  CC       treematch/tm_thread_pool.lo
  CC       treematch/tm_verbose.lo
  CC       treematch/tm_malloc.lo
  CC       treematch/tm_mapping.lo
  CC       treematch/tm_timings.lo
  CC       treematch/tm_bucket.lo
  CC       treematch/tm_tree.lo
  CC       treematch/tm_topology.lo
  CC       treematch/tm_kpartitioning.lo
  CC       treematch/tm_solution.lo
  CC       treematch/k-partitioning.lo
  CC       treematch/PriorityQueue.lo
  CCLD     mca_topo_treematch.la
make[3]: Entering directory '/scratch/build/ompi/mca/topo/treematch'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_topo_treematch.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_topo_treematch.la'
libtool: install: (cd /scratch/build/ompi/mca/topo/treematch; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_topo_treematch.la -rpath /opt/openmpi/lib/openmpi topo_treematch_module.lo topo_treematch_component.lo topo_treematch_dist_graph_create.lo treematch/IntConstantInitializedVector.lo treematch/tm_mt.lo treematch/fibo.lo treematch/tm_thread_pool.lo treematch/tm_verbose.lo treematch/tm_malloc.lo treematch/tm_mapping.lo treematch/tm_timings.lo treematch/tm_bucket.lo treematch/tm_tree.lo treematch/tm_topology.lo treematch/tm_kpartitioning.lo treematch/tm_solution.lo treematch/k-partitioning.lo treematch/PriorityQueue.lo ../../../../ompi/libmpi.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_topo_treematch.soT /opt/openmpi/lib/openmpi/mca_topo_treematch.so
libtool: install: /usr/bin/install -c .libs/mca_topo_treematch.lai /opt/openmpi/lib/openmpi/mca_topo_treematch.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/topo/treematch'
make[2]: Leaving directory '/scratch/build/ompi/mca/topo/treematch'
Making install in mca/vprotocol/pessimist
make[2]: Entering directory '/scratch/build/ompi/mca/vprotocol/pessimist'
  CC       vprotocol_pessimist.lo
  CC       vprotocol_pessimist_component.lo
  CC       vprotocol_pessimist_request.lo
  CC       vprotocol_pessimist_proc.lo
  CC       vprotocol_pessimist_comm.lo
  CC       vprotocol_pessimist_progress.lo
  CC       vprotocol_pessimist_start.lo
  CC       vprotocol_pessimist_recv.lo
  CC       vprotocol_pessimist_send.lo
  CC       vprotocol_pessimist_probe.lo
  CC       vprotocol_pessimist_wait.lo
  CC       vprotocol_pessimist_event.lo
  CC       vprotocol_pessimist_eventlog.lo
  CC       vprotocol_pessimist_sender_based.lo
  CCLD     mca_vprotocol_pessimist.la
make[3]: Entering directory '/scratch/build/ompi/mca/vprotocol/pessimist'
make[3]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib/openmpi'
 /bin/bash ../../../../libtool   --mode=install /usr/bin/install -c   mca_vprotocol_pessimist.la '/opt/openmpi/lib/openmpi'
libtool: warning: relinking 'mca_vprotocol_pessimist.la'
libtool: install: (cd /scratch/build/ompi/mca/vprotocol/pessimist; /bin/bash "/scratch/build/libtool"  --silent --tag CC --mode=relink gcc -O3 -DNDEBUG -w -finline-functions -fno-strict-aliasing -mcx16 -pthread -module -avoid-version -o mca_vprotocol_pessimist.la -rpath /opt/openmpi/lib/openmpi vprotocol_pessimist.lo vprotocol_pessimist_component.lo vprotocol_pessimist_request.lo vprotocol_pessimist_proc.lo vprotocol_pessimist_comm.lo vprotocol_pessimist_progress.lo vprotocol_pessimist_start.lo vprotocol_pessimist_recv.lo vprotocol_pessimist_send.lo vprotocol_pessimist_probe.lo vprotocol_pessimist_wait.lo vprotocol_pessimist_event.lo vprotocol_pessimist_eventlog.lo vprotocol_pessimist_sender_based.lo ../../../../ompi/libmpi.la -lrt -lm -lutil )
libtool: install: /usr/bin/install -c .libs/mca_vprotocol_pessimist.soT /opt/openmpi/lib/openmpi/mca_vprotocol_pessimist.so
libtool: install: /usr/bin/install -c .libs/mca_vprotocol_pessimist.lai /opt/openmpi/lib/openmpi/mca_vprotocol_pessimist.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib/openmpi
make[3]: Leaving directory '/scratch/build/ompi/mca/vprotocol/pessimist'
make[2]: Leaving directory '/scratch/build/ompi/mca/vprotocol/pessimist'
Making install in contrib/libompitrace
make[2]: Entering directory '/scratch/build/ompi/contrib/libompitrace'
  CC       abort.lo
  CC       accumulate.lo
  CC       add_error_class.lo
  CC       add_error_code.lo
  CC       add_error_string.lo
  CC       allgather.lo
  CC       allgatherv.lo
  CC       alloc_mem.lo
  CC       allreduce.lo
  CC       bcast.lo
  CC       get_address.lo
  CC       barrier.lo
  CC       init.lo
  CC       finalize.lo
  CC       isend.lo
  CC       recv.lo
  CC       reduce.lo
  CC       request_free.lo
  CC       send.lo
  CC       sendrecv.lo
  CCLD     libompitrace.la
make[3]: Entering directory '/scratch/build/ompi/contrib/libompitrace'
make[3]: Nothing to be done for 'install-data-am'.
 /usr/bin/mkdir -p '/opt/openmpi/lib'
 /bin/bash ../../../libtool   --mode=install /usr/bin/install -c   libompitrace.la '/opt/openmpi/lib'
libtool: install: /usr/bin/install -c .libs/libompitrace.so.40.20.0 /opt/openmpi/lib/libompitrace.so.40.20.0
libtool: install: (cd /opt/openmpi/lib && { ln -s -f libompitrace.so.40.20.0 libompitrace.so.40 || { rm -f libompitrace.so.40 && ln -s libompitrace.so.40.20.0 libompitrace.so.40; }; })
libtool: install: (cd /opt/openmpi/lib && { ln -s -f libompitrace.so.40.20.0 libompitrace.so || { rm -f libompitrace.so && ln -s libompitrace.so.40.20.0 libompitrace.so; }; })
libtool: install: /usr/bin/install -c .libs/libompitrace.lai /opt/openmpi/lib/libompitrace.la
libtool: finish: PATH="/opt/cmake/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /opt/openmpi/lib
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/openmpi/lib

----------------------------------------------------------------------
make[3]: Leaving directory '/scratch/build/ompi/contrib/libompitrace'
make[2]: Leaving directory '/scratch/build/ompi/contrib/libompitrace'
Making install in tools/ompi_info
make[2]: Entering directory '/scratch/build/ompi/tools/ompi_info'
  CC       ompi_info.o
  CC       param.o
  GENERATE ompi_info.1
  CCLD     ompi_info
make[3]: Entering directory '/scratch/build/ompi/tools/ompi_info'
 /usr/bin/mkdir -p '/opt/openmpi/bin'
 /usr/bin/mkdir -p '/opt/openmpi/share/man/man1'
  /bin/bash ../../../libtool   --mode=install /usr/bin/install -c ompi_info '/opt/openmpi/bin'
 /usr/bin/install -c -m 644 ompi_info.1 '/opt/openmpi/share/man/man1'
libtool: install: /usr/bin/install -c .libs/ompi_info /opt/openmpi/bin/ompi_info
make[3]: Leaving directory '/scratch/build/ompi/tools/ompi_info'
make[2]: Leaving directory '/scratch/build/ompi/tools/ompi_info'
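With ompi_info now installed under /opt/openmpi/bin, the MCA components installed above can be inspected from the command line. A hypothetical check, not part of this log:

# List the installed components, e.g. the fcoll, osc and pml plugins built above.
/opt/openmpi/bin/ompi_info | grep -E 'fcoll|osc|pml'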
Making install in tools/wrappers
make[2]: Entering directory '/scratch/build/ompi/tools/wrappers'
(cd ../../../opal/tools/wrappers && make  generic_wrapper.1)
rm -f mpif77.1
sed -e 's/#PROJECT#/Open MPI/g' \
    -e 's/#PROJECT_SHORT#/OMPI/g' \
    -e 's/#LANGUAGE#/Fortran/g' \
    -e 's/#PACKAGE_NAME#/Open MPI/g' \
    -e 's/#PACKAGE_VERSION#/4.0.2/g' \
    -e 's/#OMPI_DATE#/Oct 07, 2019/g' \
    < ../../../../openmpi/ompi/tools/wrappers/mpif77.1in > mpif77.1
  LN_S     mpif90.1
make[3]: Entering directory '/scratch/build/opal/tools/wrappers'
  GENERATE generic_wrapper.1
make[3]: Leaving directory '/scratch/build/opal/tools/wrappers'
rm -f mpicc.1
rm -f mpic++.1
rm -f mpicxx.1
rm -f mpifort.1
sed -e 's/#COMMAND#/mpic++/g' -e 's/#PROJECT#/Open MPI/g' -e 's/#PROJECT_SHORT#/OMPI/g' -e 's/#LANGUAGE#/C++/g' < ../../../opal/tools/wrappers/generic_wrapper.1 > mpic++.1
sed -e 's/#COMMAND#/mpicxx/g' -e 's/#PROJECT#/Open MPI/g' -e 's/#PROJECT_SHORT#/OMPI/g' -e 's/#LANGUAGE#/C++/g' < ../../../opal/tools/wrappers/generic_wrapper.1 > mpicxx.1
sed -e 's/#COMMAND#/mpicc/g' -e 's/#PROJECT#/Open MPI/g' -e 's/#PROJECT_SHORT#/OMPI/g' -e 's/#LANGUAGE#/C/g' < ../../../opal/tools/wrappers/generic_wrapper.1 > mpicc.1
sed -e 's/#COMMAND#/mpifort/g' -e 's/#PROJECT#/Open MPI/g' -e 's/#PROJECT_SHORT#/OMPI/g' -e 's/#LANGUAGE#/Fortran/g' < ../../../opal/tools/wrappers/generic_wrapper.1 > mpifort.1
make[3]: Entering directory '/scratch/build/ompi/tools/wrappers'
 /usr/bin/mkdir -p '/opt/openmpi/share/openmpi'
 /usr/bin/mkdir -p '/opt/openmpi/share/man/man1'
 /usr/bin/mkdir -p '/opt/openmpi/lib/pkgconfig'
make  install-exec-hook
 /usr/bin/install -c -m 644 mpicc-wrapper-data.txt mpic++-wrapper-data.txt mpifort-wrapper-data.txt '/opt/openmpi/share/openmpi'
 /usr/bin/install -c -m 644 ompi.pc ompi-c.pc ompi-cxx.pc ompi-fort.pc '/opt/openmpi/lib/pkgconfig'
 /usr/bin/install -c -m 644 mpicc.1 mpic++.1 mpicxx.1 mpifort.1 mpif77.1 mpif90.1 '/opt/openmpi/share/man/man1'
make  install-data-hook
make[4]: Entering directory '/scratch/build/ompi/tools/wrappers'
test -z "/opt/openmpi/bin" || /usr/bin/mkdir -p "/opt/openmpi/bin"
(cd /opt/openmpi/bin; rm -f mpicc; ln -s opal_wrapper mpicc)
(cd /opt/openmpi/bin; rm -f mpic++; ln -s opal_wrapper mpic++)
make[4]: Entering directory '/scratch/build/ompi/tools/wrappers'
(cd /opt/openmpi/share/openmpi; rm -f mpicxx-wrapper-data.txt; ln -s mpic++-wrapper-data.txt mpicxx-wrapper-data.txt)
(cd /opt/openmpi/bin; rm -f mpicxx; ln -s opal_wrapper mpicxx)
(cd /opt/openmpi/share/openmpi; rm -f mpif77-wrapper-data.txt; ln -s mpifort-wrapper-data.txt mpif77-wrapper-data.txt)
(cd /opt/openmpi/bin; rm -f mpifort; ln -s opal_wrapper mpifort)
(cd /opt/openmpi/share/openmpi; rm -f mpif90-wrapper-data.txt; ln -s mpifort-wrapper-data.txt mpif90-wrapper-data.txt)
(cd /opt/openmpi/bin; rm -f mpif77; ln -s opal_wrapper mpif77)
(cd /opt/openmpi/lib/pkgconfig; rm -f ompi-f77.pc; ln -s ompi-fort.pc ompi-f77.pc)
(cd /opt/openmpi/bin; rm -f mpif90; ln -s opal_wrapper mpif90)
(cd /opt/openmpi/lib/pkgconfig; rm -f ompi-f90.pc; ln -s ompi-fort.pc ompi-f90.pc)
(cd /opt/openmpi/bin; rm -f mpiCC; ln -s opal_wrapper mpiCC)
(cd /opt/openmpi/share/openmpi; rm -f mpiCC-wrapper-data.txt; ln -s mpic++-wrapper-data.txt mpiCC-wrapper-data.txt)
make[4]: Leaving directory '/scratch/build/ompi/tools/wrappers'
(cd /opt/openmpi/share/man/man1; rm -f mpiCC.1; ln -s mpic++.1 mpiCC.1)
make[4]: Leaving directory '/scratch/build/ompi/tools/wrappers'
make[3]: Leaving directory '/scratch/build/ompi/tools/wrappers'
make[2]: Leaving directory '/scratch/build/ompi/tools/wrappers'
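The symlinks created above make mpicc, mpic++, mpicxx, mpifort and the legacy mpif77/mpif90 names all point at the single opal_wrapper binary, with the per-language wrapper-data files installed into share/openmpi selecting the compiler and flags. A hypothetical usage sketch (hello.c is an assumed source file, not part of this build):

# Compile against the new installation via the wrapper and inspect the flags it adds.
/opt/openmpi/bin/mpicc -o hello hello.c
/opt/openmpi/bin/mpicc --showme
# Run with the launcher installed under the same prefix.
/opt/openmpi/bin/mpirun -np 2 ./hello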
Making install in tools/mpisync
make[2]: Entering directory '/scratch/build/ompi/tools/mpisync'
make[3]: Entering directory '/scratch/build/ompi/tools/mpisync'
make  install-data-hook
make[4]: Entering directory '/scratch/build/ompi/tools/mpisync'
make[4]: Nothing to be done for 'install-data-hook'.
make[4]: Leaving directory '/scratch/build/ompi/tools/mpisync'
make[3]: Leaving directory '/scratch/build/ompi/tools/mpisync'
make[2]: Leaving directory '/scratch/build/ompi/tools/mpisync'
make[1]: Leaving directory '/scratch/build/ompi'
Making install in test
make[1]: Entering directory '/scratch/build/test'
Making install in support
make[2]: Entering directory '/scratch/build/test/support'
make[3]: Entering directory '/scratch/build/test/support'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/scratch/build/test/support'
make[2]: Leaving directory '/scratch/build/test/support'
Making install in asm
make[2]: Entering directory '/scratch/build/test/asm'
make[3]: Entering directory '/scratch/build/test/asm'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/scratch/build/test/asm'
make[2]: Leaving directory '/scratch/build/test/asm'
Making install in class
make[2]: Entering directory '/scratch/build/test/class'
make[3]: Entering directory '/scratch/build/test/class'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/scratch/build/test/class'
make[2]: Leaving directory '/scratch/build/test/class'
Making install in threads
make[2]: Entering directory '/scratch/build/test/threads'
make[3]: Entering directory '/scratch/build/test/threads'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/scratch/build/test/threads'
make[2]: Leaving directory '/scratch/build/test/threads'
Making install in datatype
make[2]: Entering directory '/scratch/build/test/datatype'
make[3]: Entering directory '/scratch/build/test/datatype'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/scratch/build/test/datatype'
make[2]: Leaving directory '/scratch/build/test/datatype'
Making install in util
make[2]: Entering directory '/scratch/build/test/util'
make[3]: Entering directory '/scratch/build/test/util'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/scratch/build/test/util'
make[2]: Leaving directory '/scratch/build/test/util'
Making install in dss
make[2]: Entering directory '/scratch/build/test/dss'
make[3]: Entering directory '/scratch/build/test/dss'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/scratch/build/test/dss'
make[2]: Leaving directory '/scratch/build/test/dss'
Making install in mpool
make[2]: Entering directory '/scratch/build/test/mpool'
make[3]: Entering directory '/scratch/build/test/mpool'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/scratch/build/test/mpool'
make[2]: Leaving directory '/scratch/build/test/mpool'
Making install in monitoring
make[2]: Entering directory '/scratch/build/test/monitoring'
  CC       monitoring_test.o
  CC       test_overhead.o
  CC       test_pvar_access.o
  CC       check_monitoring.o
  CC       example_reduce_count.o
  CCLD     monitoring_test
  CCLD     test_overhead
  CCLD     test_pvar_access
  CCLD     example_reduce_count
  CCLD     check_monitoring
make[3]: Entering directory '/scratch/build/test/monitoring'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/scratch/build/test/monitoring'
make[2]: Leaving directory '/scratch/build/test/monitoring'
Making install in spc
make[2]: Entering directory '/scratch/build/test/spc'
  CC       spc_test.o
  CCLD     spc_test
make[3]: Entering directory '/scratch/build/test/spc'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/scratch/build/test/spc'
make[2]: Leaving directory '/scratch/build/test/spc'
make[2]: Entering directory '/scratch/build/test'
make[3]: Entering directory '/scratch/build/test'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/scratch/build/test'
make[2]: Leaving directory '/scratch/build/test'
make[1]: Leaving directory '/scratch/build/test'
make[1]: Entering directory '/scratch/build'
make[2]: Entering directory '/scratch/build'
make  install-exec-hook
make[2]: Nothing to be done for 'install-data-am'.
make[3]: Entering directory '/scratch/build'
make[3]: Leaving directory '/scratch/build'
make[2]: Leaving directory '/scratch/build'
make[1]: Leaving directory '/scratch/build'
Selecting previously unselected package intel-oneapi-tbb-common-2021.8.0.
Preparing to unpack .../16-intel-oneapi-tbb-common-2021.8.0_2021.8.0-25334_all.deb ...
Unpacking intel-oneapi-tbb-common-2021.8.0 (2021.8.0-25334) ...
Selecting previously unselected package intel-oneapi-tbb-2021.8.0.
Preparing to unpack .../17-intel-oneapi-tbb-2021.8.0_2021.8.0-25334_amd64.deb ...
Unpacking intel-oneapi-tbb-2021.8.0 (2021.8.0-25334) ...
Selecting previously unselected package intel-oneapi-compiler-dpcpp-cpp-runtime-2023.0.0.
Preparing to unpack .../18-intel-oneapi-compiler-dpcpp-cpp-runtime-2023.0.0_2023.0.0-25370_amd64.deb ...
Unpacking intel-oneapi-compiler-dpcpp-cpp-runtime-2023.0.0 (2023.0.0-25370) ...
Selecting previously unselected package intel-oneapi-compiler-shared-common-2023.0.0.
Preparing to unpack .../19-intel-oneapi-compiler-shared-common-2023.0.0_2023.0.0-25370_all.deb ...
Unpacking intel-oneapi-compiler-shared-common-2023.0.0 (2023.0.0-25370) ...
 ---> Removed intermediate container 82e0ab48e4bf
 ---> 79da27e5a3c1
Step 12/25 : ENV PATH=${OPENMPI_DIR}/bin:$PATH
 ---> Running in 2b5689c78d91
 ---> Removed intermediate container 2b5689c78d91
 ---> 2f45240eccd8
Step 13/25 : ARG KOKKOS_VERSION=4.1.00
 ---> Running in ffff06aba549
 ---> Removed intermediate container ffff06aba549
 ---> 4fa8b941603f
Step 14/25 : ENV KOKKOS_DIR=/opt/kokkos
 ---> Running in 0a04572685bc
 ---> Removed intermediate container 0a04572685bc
 ---> d68b2c1fde9d
Step 15/25 : RUN KOKKOS_URL=https://github.com/kokkos/kokkos/archive/${KOKKOS_VERSION}.tar.gz &&     KOKKOS_ARCHIVE=kokkos-${KOKKOS_VERSION}.tar.gz &&     SCRATCH_DIR=/scratch && mkdir -p ${SCRATCH_DIR} && cd ${SCRATCH_DIR} &&     wget --quiet ${KOKKOS_URL} --output-document=${KOKKOS_ARCHIVE} &&     mkdir -p kokkos &&     tar -xf ${KOKKOS_ARCHIVE} -C kokkos --strip-components=1 &&     cd kokkos &&     mkdir -p build && cd build &&     cmake       -D CMAKE_INSTALL_PREFIX=${KOKKOS_DIR}       -D CMAKE_BUILD_TYPE=Debug       -D CMAKE_CXX_COMPILER=${SCRATCH_DIR}/kokkos/bin/nvcc_wrapper       -D Kokkos_ENABLE_CUDA=ON       -D Kokkos_ENABLE_CUDA_LAMBDA=ON       -D Kokkos_ARCH_VOLTA70=ON     .. &&     make -j${NPROCS} install &&     rm -rf ${SCRATCH_DIR}
 ---> Running in f592ad746c19
Selecting previously unselected package intel-oneapi-dpcpp-debugger-eclipse-cfg.
Preparing to unpack .../20-intel-oneapi-dpcpp-debugger-eclipse-cfg_2023.1.0-43513_all.deb ...
Unpacking intel-oneapi-dpcpp-debugger-eclipse-cfg (2023.1.0-43513) ...
Selecting previously unselected package intel-oneapi-dpcpp-debugger-2023.0.0.
Preparing to unpack .../21-intel-oneapi-dpcpp-debugger-2023.0.0_2023.0.0-25336_amd64.deb ...
Unpacking intel-oneapi-dpcpp-debugger-2023.0.0 (2023.0.0-25336) ...
-- Setting default Kokkos CXX standard to 17
-- The CXX compiler identification is GNU 9.4.0
-- Check for working CXX compiler: /scratch/kokkos/bin/nvcc_wrapper
-- Check for working CXX compiler: /scratch/kokkos/bin/nvcc_wrapper -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Kokkos version: 4.1.00
-- The project name is: Kokkos
-- Using internal gtest for testing
-- Compiler Version: 11.0.221
-- SERIAL backend is being turned on to ensure there is at least one Host space. To change this, you must enable another host execution space and configure with -DKokkos_ENABLE_SERIAL=OFF or change CMakeCache.txt
-- Using -std=c++17 for C++17 standard as feature
-- Built-in Execution Spaces:
--     Device Parallel: Kokkos::Cuda
--     Host Parallel: NoTypeDefined
--       Host Serial: SERIAL
-- 
-- Architectures:
--  VOLTA70
-- Found CUDAToolkit: /usr/local/cuda/include (found version "11.0.221") 
-- Looking for C++ include pthread.h
-- Looking for C++ include pthread.h - found
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
-- Found Threads: TRUE  
-- Found TPLCUDA: TRUE  
-- Found TPLLIBDL: /usr/include  
-- Using internal desul_atomics copy
-- Kokkos Devices: CUDA;SERIAL, Kokkos Backends: CUDA;SERIAL
-- Configuring done
-- Generating done
-- Build files have been written to: /scratch/kokkos/build
Scanning dependencies of target AlwaysCheckGit
Scanning dependencies of target kokkossimd
[  3%] Building CXX object simd/src/CMakeFiles/kokkossimd.dir/Kokkos_SIMD_dummy.cpp.o
[  3%] Built target AlwaysCheckGit
Scanning dependencies of target impl_git_version
[  6%] Building CXX object CMakeFiles/impl_git_version.dir/generated/Kokkos_Version_Info.cpp.o
nvcc_wrapper has been given GNU extension standard flag -std=gnu++11 - reverting flag to -std=c++11
Scanning dependencies of target kokkoscore
[ 10%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_CPUDiscovery.cpp.o
[ 13%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_Command_Line_Parsing.cpp.o
[ 17%] Linking CXX static library libkokkossimd.a
[ 17%] Built target kokkossimd
[ 20%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_Core.cpp.o
[ 24%] Linking CXX static library libimpl_git_version.a
[ 24%] Built target impl_git_version
[ 27%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_Error.cpp.o
[ 31%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_ExecPolicy.cpp.o
[ 34%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_HostBarrier.cpp.o
[ 37%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_HostSpace.cpp.o
[ 41%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_HostSpace_deepcopy.cpp.o
[ 44%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_HostThreadTeam.cpp.o
[ 48%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_MemoryPool.cpp.o
[ 51%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_MemorySpace.cpp.o
[ 55%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_Profiling.cpp.o
[ 58%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_SharedAlloc.cpp.o
[ 62%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_Spinwait.cpp.o
[ 65%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_Stacktrace.cpp.o
Selecting previously unselected package intel-oneapi-compiler-shared-2023.0.0.
Preparing to unpack .../22-intel-oneapi-compiler-shared-2023.0.0_2023.0.0-25370_amd64.deb ...
Unpacking intel-oneapi-compiler-shared-2023.0.0 (2023.0.0-25370) ...
[ 68%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_hwloc.cpp.o
[ 72%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/Cuda/Kokkos_CudaSpace.cpp.o
[ 75%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/Cuda/Kokkos_Cuda_Instance.cpp.o
[ 79%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/Cuda/Kokkos_Cuda_Task.cpp.o
Selecting previously unselected package intel-oneapi-tbb-common-devel-2021.8.0.
Preparing to unpack .../23-intel-oneapi-tbb-common-devel-2021.8.0_2021.8.0-25334_all.deb ...
Unpacking intel-oneapi-tbb-common-devel-2021.8.0 (2021.8.0-25334) ...
Selecting previously unselected package intel-oneapi-tbb-devel-2021.8.0.
Preparing to unpack .../24-intel-oneapi-tbb-devel-2021.8.0_2021.8.0-25334_amd64.deb ...
Unpacking intel-oneapi-tbb-devel-2021.8.0 (2021.8.0-25334) ...
Selecting previously unselected package intel-oneapi-dev-utilities-eclipse-cfg.
Preparing to unpack .../25-intel-oneapi-dev-utilities-eclipse-cfg_2021.10.0-49423_all.deb ...
Unpacking intel-oneapi-dev-utilities-eclipse-cfg (2021.10.0-49423) ...
Selecting previously unselected package intel-oneapi-dev-utilities-2021.8.0.
Preparing to unpack .../26-intel-oneapi-dev-utilities-2021.8.0_2021.8.0-25328_amd64.deb ...
Unpacking intel-oneapi-dev-utilities-2021.8.0 (2021.8.0-25328) ...
[ 82%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/Serial/Kokkos_Serial.cpp.o
Selecting previously unselected package intel-oneapi-dpcpp-cpp-2023.0.0.
Preparing to unpack .../27-intel-oneapi-dpcpp-cpp-2023.0.0_2023.0.0-25370_amd64.deb ...
Unpacking intel-oneapi-dpcpp-cpp-2023.0.0 (2023.0.0-25370) ...
[ 86%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/Serial/Kokkos_Serial_Task.cpp.o
[ 89%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/__/__/tpls/desul/src/Lock_Array_CUDA.cpp.o
[ 93%] Linking CXX static library libkokkoscore.a
[ 93%] Built target kokkoscore
Scanning dependencies of target kokkoscontainers
[ 96%] Building CXX object containers/src/CMakeFiles/kokkoscontainers.dir/impl/Kokkos_UnorderedMap_impl.cpp.o
[100%] Linking CXX static library libkokkoscontainers.a
[100%] Built target kokkoscontainers
Install the project...
-- Install configuration: "Debug"
-- Installing: /opt/kokkos/include
-- Installing: /opt/kokkos/include/Kokkos_Crs.hpp
-- Installing: /opt/kokkos/include/Kokkos_HostSpace.hpp
-- Installing: /opt/kokkos/include/Kokkos_AnonymousSpace.hpp
-- Installing: /opt/kokkos/include/Kokkos_Complex.hpp
-- Installing: /opt/kokkos/include/Kokkos_Extents.hpp
-- Installing: /opt/kokkos/include/Kokkos_Rank.hpp
-- Installing: /opt/kokkos/include/KokkosExp_MDRangePolicy.hpp
-- Installing: /opt/kokkos/include/Kokkos_Half.hpp
-- Installing: /opt/kokkos/include/Kokkos_UniqueToken.hpp
-- Installing: /opt/kokkos/include/Kokkos_Atomics_Desul_Config.hpp
-- Installing: /opt/kokkos/include/Kokkos_LogicalSpaces.hpp
-- Installing: /opt/kokkos/include/Kokkos_Atomic.hpp
-- Installing: /opt/kokkos/include/Kokkos_GraphNode.hpp
-- Installing: /opt/kokkos/include/Kokkos_BitManipulation.hpp
-- Installing: /opt/kokkos/include/Kokkos_View.hpp
-- Installing: /opt/kokkos/include/Kokkos_TaskScheduler.hpp
-- Installing: /opt/kokkos/include/Kokkos_ScratchSpace.hpp
-- Installing: /opt/kokkos/include/Serial
-- Installing: /opt/kokkos/include/Serial/Kokkos_Serial_Task.hpp
-- Installing: /opt/kokkos/include/Serial/Kokkos_Serial_Parallel_Range.hpp
-- Installing: /opt/kokkos/include/Serial/Kokkos_Serial.hpp
-- Installing: /opt/kokkos/include/Serial/Kokkos_Serial_UniqueToken.hpp
-- Installing: /opt/kokkos/include/Serial/Kokkos_Serial_ZeroMemset.hpp
-- Installing: /opt/kokkos/include/Serial/Kokkos_Serial_Parallel_MDRange.hpp
-- Installing: /opt/kokkos/include/Serial/Kokkos_Serial_WorkGraphPolicy.hpp
-- Installing: /opt/kokkos/include/Serial/Kokkos_Serial_MDRangePolicy.hpp
-- Installing: /opt/kokkos/include/Serial/Kokkos_Serial_Parallel_Team.hpp
-- Installing: /opt/kokkos/include/Kokkos_ReductionIdentity.hpp
-- Installing: /opt/kokkos/include/Kokkos_Atomics_Desul_Volatile_Wrapper.hpp
-- Installing: /opt/kokkos/include/Kokkos_MathematicalConstants.hpp
-- Installing: /opt/kokkos/include/Kokkos_Timer.hpp
-- Installing: /opt/kokkos/include/Kokkos_Concepts.hpp
-- Installing: /opt/kokkos/include/Kokkos_MemoryTraits.hpp
-- Installing: /opt/kokkos/include/Kokkos_Profiling_ScopedRegion.hpp
-- Installing: /opt/kokkos/include/Threads
-- Installing: /opt/kokkos/include/Threads/Kokkos_Threads_Parallel_MDRange.hpp
-- Installing: /opt/kokkos/include/Threads/Kokkos_Threads.hpp
-- Installing: /opt/kokkos/include/Threads/Kokkos_Threads_Parallel_Range.hpp
-- Installing: /opt/kokkos/include/Threads/Kokkos_ThreadsExec.hpp
-- Installing: /opt/kokkos/include/Threads/Kokkos_ThreadsTeam.hpp
-- Installing: /opt/kokkos/include/Threads/Kokkos_Threads_WorkGraphPolicy.hpp
-- Installing: /opt/kokkos/include/Threads/Kokkos_Threads_UniqueToken.hpp
-- Installing: /opt/kokkos/include/Threads/Kokkos_Threads_MDRangePolicy.hpp
-- Installing: /opt/kokkos/include/Threads/Kokkos_Threads_Parallel_Team.hpp
-- Installing: /opt/kokkos/include/Kokkos_Parallel.hpp
-- Installing: /opt/kokkos/include/Kokkos_Atomics_Desul_Wrapper.hpp
-- Installing: /opt/kokkos/include/Kokkos_DetectionIdiom.hpp
-- Installing: /opt/kokkos/include/Kokkos_PointerOwnership.hpp
-- Installing: /opt/kokkos/include/Kokkos_Profiling_ProfileSection.hpp
-- Installing: /opt/kokkos/include/HPX
-- Installing: /opt/kokkos/include/HPX/Kokkos_HPX_Task.hpp
-- Installing: /opt/kokkos/include/HPX/Kokkos_HPX_MDRangePolicy.hpp
-- Installing: /opt/kokkos/include/HPX/Kokkos_HPX.hpp
-- Installing: /opt/kokkos/include/HPX/Kokkos_HPX_WorkGraphPolicy.hpp
-- Installing: /opt/kokkos/include/Kokkos_CopyViews.hpp
-- Installing: /opt/kokkos/include/traits
-- Installing: /opt/kokkos/include/traits/Kokkos_OccupancyControlTrait.hpp
-- Installing: /opt/kokkos/include/traits/Kokkos_IndexTypeTrait.hpp
-- Installing: /opt/kokkos/include/traits/Kokkos_IterationPatternTrait.hpp
-- Installing: /opt/kokkos/include/traits/Kokkos_Traits_fwd.hpp
-- Installing: /opt/kokkos/include/traits/Kokkos_WorkTagTrait.hpp
-- Installing: /opt/kokkos/include/traits/Kokkos_PolicyTraitAdaptor.hpp
-- Installing: /opt/kokkos/include/traits/Kokkos_LaunchBoundsTrait.hpp
-- Installing: /opt/kokkos/include/traits/Kokkos_ScheduleTrait.hpp
-- Installing: /opt/kokkos/include/traits/Kokkos_GraphKernelTrait.hpp
-- Installing: /opt/kokkos/include/traits/Kokkos_PolicyTraitMatcher.hpp
-- Installing: /opt/kokkos/include/traits/Kokkos_WorkItemPropertyTrait.hpp
-- Installing: /opt/kokkos/include/traits/Kokkos_ExecutionSpaceTrait.hpp
-- Installing: /opt/kokkos/include/KokkosExp_InterOp.hpp
-- Installing: /opt/kokkos/include/Kokkos_hwloc.hpp
-- Installing: /opt/kokkos/include/Kokkos_Macros.hpp
-- Installing: /opt/kokkos/include/fwd
-- Installing: /opt/kokkos/include/fwd/Kokkos_Fwd_OPENMP.hpp
-- Installing: /opt/kokkos/include/fwd/Kokkos_Fwd_OPENACC.hpp
-- Installing: /opt/kokkos/include/fwd/Kokkos_Fwd_OPENMPTARGET.hpp
-- Installing: /opt/kokkos/include/fwd/Kokkos_Fwd_HBWSpace.hpp
-- Installing: /opt/kokkos/include/fwd/Kokkos_Fwd_SYCL.hpp
-- Installing: /opt/kokkos/include/fwd/Kokkos_Fwd_THREADS.hpp
-- Installing: /opt/kokkos/include/fwd/Kokkos_Fwd_SERIAL.hpp
-- Installing: /opt/kokkos/include/fwd/Kokkos_Fwd_HPX.hpp
-- Installing: /opt/kokkos/include/fwd/Kokkos_Fwd_HIP.hpp
-- Installing: /opt/kokkos/include/fwd/Kokkos_Fwd_CUDA.hpp
-- Installing: /opt/kokkos/include/Kokkos_MathematicalFunctions.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_Parallel_Common.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_Parallel.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_Reducer.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_UniqueToken.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_ParallelScan_Team.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_MDRangePolicy.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_Parallel_MDRange.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTargetSpace.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_ParallelScan_Range.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_Task.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_ParallelReduce_Range.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_ParallelReduce_Team.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_ParallelFor_Team.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_Error.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_Instance.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_ParallelFor_Range.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_Abort.hpp
-- Installing: /opt/kokkos/include/Kokkos_Parallel_Reduce.hpp
-- Installing: /opt/kokkos/include/Kokkos_Array.hpp
-- Installing: /opt/kokkos/include/Kokkos_MathematicalSpecialFunctions.hpp
-- Installing: /opt/kokkos/include/Kokkos_Graph_fwd.hpp
-- Installing: /opt/kokkos/include/OpenACC
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_SharedAllocationRecord.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_ParallelReduce_Range.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_DeepCopy.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_Traits.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_Team.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_ParallelReduce_Team.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_MDRangePolicy.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_ParallelFor_Range.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_ParallelFor_MDRange.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_Macros.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_ParallelFor_Team.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_Instance.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_ParallelScan_Range.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_ScheduleType.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_FunctorAdapter.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACCSpace.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_ParallelReduce_MDRange.hpp
-- Installing: /opt/kokkos/include/decl
-- Installing: /opt/kokkos/include/decl/Kokkos_Declare_SYCL.hpp
-- Installing: /opt/kokkos/include/decl/Kokkos_Declare_OPENMP.hpp
-- Installing: /opt/kokkos/include/decl/Kokkos_Declare_HBWSpace.hpp
-- Installing: /opt/kokkos/include/decl/Kokkos_Declare_HIP.hpp
-- Installing: /opt/kokkos/include/decl/Kokkos_Declare_OPENACC.hpp
-- Installing: /opt/kokkos/include/decl/Kokkos_Declare_SERIAL.hpp
-- Installing: /opt/kokkos/include/decl/Kokkos_Declare_OPENMPTARGET.hpp
-- Installing: /opt/kokkos/include/decl/Kokkos_Declare_THREADS.hpp
-- Installing: /opt/kokkos/include/decl/Kokkos_Declare_CUDA.hpp
-- Installing: /opt/kokkos/include/decl/Kokkos_Declare_HPX.hpp
-- Installing: /opt/kokkos/include/Kokkos_MemoryPool.hpp
-- Installing: /opt/kokkos/include/setup
-- Installing: /opt/kokkos/include/setup/Kokkos_Setup_Cuda.hpp
-- Installing: /opt/kokkos/include/setup/Kokkos_Setup_SYCL.hpp
-- Installing: /opt/kokkos/include/setup/Kokkos_Setup_HIP.hpp
-- Installing: /opt/kokkos/include/Kokkos_WorkGraphPolicy.hpp
-- Installing: /opt/kokkos/include/Kokkos_MinMaxClamp.hpp
-- Installing: /opt/kokkos/include/Kokkos_Tuners.hpp
-- Installing: /opt/kokkos/include/Kokkos_HBWSpace.hpp
-- Installing: /opt/kokkos/include/Kokkos_NumericTraits.hpp
-- Installing: /opt/kokkos/include/Kokkos_MasterLock.hpp
-- Installing: /opt/kokkos/include/Kokkos_AcquireUniqueTokenImpl.hpp
-- Installing: /opt/kokkos/include/SYCL
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_Instance.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_MDRangePolicy.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_Half_Conversion.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_Parallel_Team.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_Space.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_ZeroMemset.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_Team.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_UniqueToken.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_Parallel_Range.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_Parallel_Reduce.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_DeepCopy.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_Abort.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_Half_Impl_Type.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_Parallel_Scan.hpp
-- Installing: /opt/kokkos/include/Kokkos_Vectorization.hpp
-- Installing: /opt/kokkos/include/Kokkos_Future.hpp
-- Installing: /opt/kokkos/include/HIP
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_KernelLaunch.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_Parallel_Team.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_Space.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_ReduceScan.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_Error.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_Instance.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_Team.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_WorkGraphPolicy.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_Abort.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_UniqueToken.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_Vectorization.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_ZeroMemset.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_Half_Conversion.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_Shuffle_Reduce.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_Half_Impl_Type.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_Parallel_Range.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_Parallel_MDRange.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_BlockSize_Deduction.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_DeepCopy.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_SharedAllocationRecord.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_MDRangePolicy.hpp
-- Installing: /opt/kokkos/include/impl
-- Installing: /opt/kokkos/include/impl/Kokkos_TaskPolicyData.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_AnalyzePolicy.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_GraphImpl_fwd.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Volatile_Load.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_MemorySpace.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_TaskQueueCommon.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Default_GraphNode_Impl.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_DeviceManagement.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_BitOps.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_ParseCommandLineArgumentsAndEnvironmentVariables.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_CPUDiscovery.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_StringManipulation.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_SharedAlloc_timpl.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_HostThreadTeam.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_HostSharedPtr.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Profiling_DeviceInfo.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_SimpleTaskScheduler.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_HostSpace_deepcopy.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Profiling_Interface.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_HostSpace_ZeroMemset.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_TaskNode.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_GraphImpl.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_ViewUniformType.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_TaskQueue_impl.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_QuadPrecisionMath.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_ChaseLev.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_GraphNodeCustomization.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_TaskQueueMemoryManager.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Command_Line_Parsing.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_TaskQueue.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Half_FloatingPointWrapper.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Default_Graph_fwd.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_TaskQueueMultiple_impl.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Spinwait.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Traits.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_ViewCtor.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_VLAEmulation.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_FixedBufferMemoryPool.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Profiling.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_ViewTracker.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Memory_Fence.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_HostBarrier.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_EBO.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Atomic_View.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_InitializationSettings.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_LinkedListNode.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_TaskResult.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Default_GraphNodeKernel.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_TaskBase.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_ZeroMemset_fwd.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_SingleTaskQueue.hpp
-- Installing: /opt/kokkos/include/impl/KokkosExp_ViewMapping.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_SharedAlloc.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Error.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_ClockTic.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_FunctorAnalysis.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Stacktrace.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_ViewMapping.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_TeamMDPolicy.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_ViewArray.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_LIFO.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Profiling_C_Interface.h
-- Installing: /opt/kokkos/include/impl/Kokkos_Half_NumericTraits.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_GraphNodeImpl.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_ExecSpaceManager.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_TaskQueueMultiple.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Tools_Generic.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_GraphImpl_Utilities.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Default_Graph_Impl.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_MultipleTaskQueue.hpp
-- Installing: /opt/kokkos/include/impl/KokkosExp_IterateTileGPU.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_ConcurrentBitset.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_NvidiaGpuArchitectures.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_ViewLayoutTiled.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_OptionalRef.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_TaskTeamMember.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Combined_Reducer.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Utilities.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Tools.hpp
-- Installing: /opt/kokkos/include/impl/KokkosExp_Host_IterateTile.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_MemoryPoolAllocator.hpp
-- Installing: /opt/kokkos/include/Kokkos_TaskScheduler_fwd.hpp
-- Installing: /opt/kokkos/include/OpenMP
-- Installing: /opt/kokkos/include/OpenMP/Kokkos_OpenMP_Team.hpp
-- Installing: /opt/kokkos/include/OpenMP/Kokkos_OpenMP_UniqueToken.hpp
-- Installing: /opt/kokkos/include/OpenMP/Kokkos_OpenMP_MDRangePolicy.hpp
-- Installing: /opt/kokkos/include/OpenMP/Kokkos_OpenMP.hpp
-- Installing: /opt/kokkos/include/OpenMP/Kokkos_OpenMP_Parallel.hpp
-- Installing: /opt/kokkos/include/OpenMP/Kokkos_OpenMP_Instance.hpp
-- Installing: /opt/kokkos/include/OpenMP/Kokkos_OpenMP_WorkGraphPolicy.hpp
-- Installing: /opt/kokkos/include/OpenMP/Kokkos_OpenMP_Task.hpp
-- Installing: /opt/kokkos/include/Cuda
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_ZeroMemset.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_Parallel_MDRange.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_Graph_Impl.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_Vectorization.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_CudaSpace.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_GraphNodeKernel.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_BlockSize_Deduction.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_GraphNode_Impl.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_KernelLaunch.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_UniqueToken.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_abort.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_WorkGraphPolicy.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_Half_Conversion.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_Parallel_Range.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_Instance.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_Error.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_Task.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_View.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_ReduceScan.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_Team.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_Half_Impl_Type.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_MDRangePolicy.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_Parallel_Team.hpp
-- Installing: /opt/kokkos/include/Kokkos_Core_fwd.hpp
-- Installing: /opt/kokkos/include/Kokkos_Layout.hpp
-- Installing: /opt/kokkos/include/Kokkos_ExecPolicy.hpp
-- Installing: /opt/kokkos/include/Kokkos_Graph.hpp
-- Installing: /opt/kokkos/include/Kokkos_Pair.hpp
-- Installing: /opt/kokkos/include/Kokkos_Core.hpp
-- Installing: /opt/kokkos/include/View
-- Installing: /opt/kokkos/include/View/MDSpan
-- Installing: /opt/kokkos/include/View/MDSpan/Kokkos_MDSpan_Header.hpp
-- Installing: /opt/kokkos/include/View/MDSpan/Kokkos_MDSpan_Extents.hpp
-- Installing: /opt/kokkos/include/View/Hooks
-- Installing: /opt/kokkos/include/View/Hooks/Kokkos_ViewHooks.hpp
-- Installing: /opt/kokkos/include/desul
-- Installing: /opt/kokkos/include/desul/atomics
-- Installing: /opt/kokkos/include/desul/atomics/Compare_Exchange_SYCL.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Lock_Based_Fetch_Op_SYCL.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Lock_Based_Fetch_Op_Host.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Thread_Fence.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Adapt_GCC.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Thread_Fence_SYCL.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Thread_Fence_ScopeCaller.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Adapt_SYCL.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Atomic_Ref.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Compare_Exchange_OpenMP.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Fetch_Op_CUDA.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Adapt_CXX.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Fetch_Op.hpp
-- Installing: /opt/kokkos/include/desul/atomics/openmp
-- Installing: /opt/kokkos/include/desul/atomics/openmp/OpenMP_40.hpp
-- Installing: /opt/kokkos/include/desul/atomics/openmp/OpenMP_40_op.inc
-- Installing: /opt/kokkos/include/desul/atomics/Lock_Array_CUDA.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Fetch_Op_GCC.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Lock_Based_Fetch_Op.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Thread_Fence_GCC.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Generic.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Lock_Array_HIP.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Operator_Function_Objects.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Compare_Exchange_CUDA.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Compare_Exchange_GCC.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Lock_Based_Fetch_Op_CUDA.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Lock_Free_Fetch_Op.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Lock_Based_Fetch_Op_HIP.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Fetch_Op_Generic.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Thread_Fence_CUDA.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Common.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Compare_Exchange_ScopeCaller.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Compare_Exchange_MSVC.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Fetch_Op_ScopeCaller.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Lock_Array.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Fetch_Op_SYCL.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Fetch_Op_HIP.hpp
-- Installing: /opt/kokkos/include/desul/atomics/cuda
-- Installing: /opt/kokkos/include/desul/atomics/cuda/CUDA_asm.hpp
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_atomic_op.inc_predicate
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_exchange_op.inc
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_atomic_op.inc_generic
-- Installing: /opt/kokkos/include/desul/atomics/cuda/CUDA_asm_exchange.hpp
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_exchange_memorder.inc
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_memorder.inc
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_atomic_fetch_op.inc_generic
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_atomic_fetch_op.inc_predicate
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm.inc
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_atomic_op.inc
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_atomic_fetch_op.inc_forceglobal
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_atomic_op.inc_isglobal
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_exchange.inc
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_atomic_fetch_op.inc
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_atomic_op.inc_forceglobal
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_atomic_fetch_op.inc_isglobal
-- Installing: /opt/kokkos/include/desul/atomics/Lock_Array_SYCL.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Thread_Fence_HIP.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Thread_Fence_MSVC.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Macros.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Fetch_Op_OpenMP.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Compare_Exchange.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Thread_Fence_OpenMP.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Compare_Exchange_HIP.hpp
-- Installing: /opt/kokkos/include/desul/atomics.hpp
-- Up-to-date: /opt/kokkos/include/desul
-- Up-to-date: /opt/kokkos/include/desul/atomics
-- Installing: /opt/kokkos/include/desul/atomics/Config.hpp
-- Installing: /opt/kokkos/lib/libkokkoscore.a
-- Up-to-date: /opt/kokkos/lib/libkokkoscore.a
-- Up-to-date: /opt/kokkos/include
-- Installing: /opt/kokkos/include/Kokkos_OffsetView.hpp
-- Installing: /opt/kokkos/include/Kokkos_Functional.hpp
-- Installing: /opt/kokkos/include/Kokkos_Vector.hpp
-- Installing: /opt/kokkos/include/Kokkos_Bitset.hpp
-- Installing: /opt/kokkos/include/Kokkos_ErrorReporter.hpp
-- Installing: /opt/kokkos/include/Kokkos_DualView.hpp
-- Installing: /opt/kokkos/include/Kokkos_UnorderedMap.hpp
-- Installing: /opt/kokkos/include/Kokkos_ScatterView.hpp
-- Installing: /opt/kokkos/include/Kokkos_DynamicView.hpp
-- Up-to-date: /opt/kokkos/include/impl
-- Installing: /opt/kokkos/include/impl/Kokkos_Functional_impl.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_UnorderedMap_impl.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Bitset_impl.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_StaticCrsGraph_factory.hpp
-- Installing: /opt/kokkos/include/Kokkos_StaticCrsGraph.hpp
-- Installing: /opt/kokkos/include/Kokkos_DynRankView.hpp
-- Installing: /opt/kokkos/lib/libkokkoscontainers.a
-- Up-to-date: /opt/kokkos/lib/libkokkoscontainers.a
-- Up-to-date: /opt/kokkos/include
-- Installing: /opt/kokkos/include/Kokkos_Random.hpp
-- Installing: /opt/kokkos/include/Kokkos_StdAlgorithms.hpp
-- Installing: /opt/kokkos/include/std_algorithms
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Count.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_ReverseCopy.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Rotate.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Fill.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_IsSorted.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_CountIf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_PartitionCopy.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_RemoveCopy.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_CopyIf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_MaxElement.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Reduce.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_IsSortedUntil.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_ReplaceCopy.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_BeginEnd.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_TransformInclusiveScan.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_RotateCopy.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Search.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Distance.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_IterSwap.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_FillN.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_AdjacentFind.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_FindEnd.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_FindIfNot.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_UniqueCopy.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Move.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_AnyOf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_AdjacentDifference.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_InclusiveScan.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_MinElement.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_ShiftRight.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Reverse.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_IsPartitioned.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Transform.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_ShiftLeft.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_ForEach.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_FindFirstOf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Remove.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_SwapRanges.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Equal.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Find.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_GenerateN.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_PartitionPoint.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_CopyBackward.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_LexicographicalCompare.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Unique.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Generate.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_TransformReduce.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Replace.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_TransformExclusiveScan.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_ReplaceCopyIf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Swap.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_RemoveCopyIf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_SearchN.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_ReplaceIf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_RemoveIf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_MinMaxElement.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Mismatch.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_ReverseCopy.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_ReducerWithArbitraryJoinerNoNeutralElement.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_Rotate.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_IsSorted.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_PartitionCopy.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_CopyIf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_FindIfOrNot.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_IdentityReferenceUnaryFunctor.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_CopyCopyN.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_Reduce.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_HelperPredicates.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_IsSortedUntil.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_ReplaceCopy.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_TransformInclusiveScan.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_RotateCopy.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_Constraints.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_Search.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_ForEachForEachN.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_AdjacentFind.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_FindEnd.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_UniqueCopy.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_Move.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_AdjacentDifference.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_InclusiveScan.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_ShiftRight.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_MinMaxMinmaxElement.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_Reverse.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_IsPartitioned.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_Transform.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_ShiftLeft.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_FindFirstOf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_SwapRanges.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_ValueWrapperForNoNeutralElement.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_Equal.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_GenerateGenerateN.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_PartitionPoint.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_CopyBackward.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_LexicographicalCompare.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_Unique.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_FillFillN.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_TransformReduce.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_Replace.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_TransformExclusiveScan.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_ReplaceCopyIf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_RemoveAllVariants.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_CountCountIf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_SearchN.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_ReplaceIf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_AllOfAnyOfNoneOf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_Mismatch.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_ExclusiveScan.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_MoveBackward.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_RandomAccessIterator.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_AllOf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Copy.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_CopyN.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_ExclusiveScan.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_MoveBackward.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_FindIf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_ForEachN.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_NoneOf.hpp
-- Installing: /opt/kokkos/include/Kokkos_Sort.hpp
-- Installing: /opt/kokkos/include/Kokkos_NestedSort.hpp
-- Up-to-date: /opt/kokkos/include
-- Installing: /opt/kokkos/include/Kokkos_SIMD_Common.hpp
-- Installing: /opt/kokkos/include/Kokkos_SIMD_NEON.hpp
-- Installing: /opt/kokkos/include/Kokkos_SIMD_AVX2.hpp
-- Installing: /opt/kokkos/include/Kokkos_SIMD.hpp
-- Installing: /opt/kokkos/include/Kokkos_SIMD_Scalar.hpp
-- Installing: /opt/kokkos/include/Kokkos_SIMD_AVX512.hpp
-- Installing: /opt/kokkos/lib/libkokkossimd.a
-- Up-to-date: /opt/kokkos/lib/libkokkossimd.a
-- Installing: /opt/kokkos/lib/cmake/Kokkos/KokkosConfig.cmake
-- Installing: /opt/kokkos/lib/cmake/Kokkos/KokkosConfigCommon.cmake
-- Installing: /opt/kokkos/lib/cmake/Kokkos/KokkosConfigVersion.cmake
-- Installing: /opt/kokkos/lib/cmake/Kokkos/KokkosTargets.cmake
-- Installing: /opt/kokkos/lib/cmake/Kokkos/KokkosTargets-debug.cmake
-- Installing: /opt/kokkos/include/KokkosCore_config.h
-- Installing: /opt/kokkos/bin/nvcc_wrapper
-- Installing: /opt/kokkos/bin/hpcbind
-- Installing: /opt/kokkos/bin/kokkos_launch_compiler
-- Up-to-date: /opt/kokkos/include/KokkosCore_config.h
-- Installing: /opt/kokkos/include/KokkosCore_Config_FwdBackend.hpp
-- Installing: /opt/kokkos/include/KokkosCore_Config_SetupBackend.hpp
-- Installing: /opt/kokkos/include/KokkosCore_Config_DeclareBackend.hpp
-- Installing: /opt/kokkos/include/KokkosCore_Config_PostInclude.hpp
 ---> Removed intermediate container f592ad746c19
 ---> 17715c9c5786
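Step 15 above is a single RUN line; unrolled, it builds Kokkos 4.1.00 in Debug mode with the CUDA backend, CUDA lambdas, and the Volta70 architecture, using the bundled nvcc_wrapper as the C++ compiler. Roughly the following, where NPROCS comes from an earlier build argument not shown in this part of the log:

# Same configure/build as Step 15, spread over readable lines.
KOKKOS_VERSION=4.1.00
KOKKOS_DIR=/opt/kokkos
SCRATCH_DIR=/scratch
mkdir -p ${SCRATCH_DIR} && cd ${SCRATCH_DIR}
wget --quiet https://github.com/kokkos/kokkos/archive/${KOKKOS_VERSION}.tar.gz \
     --output-document=kokkos-${KOKKOS_VERSION}.tar.gz
mkdir -p kokkos && tar -xf kokkos-${KOKKOS_VERSION}.tar.gz -C kokkos --strip-components=1
cd kokkos && mkdir -p build && cd build
cmake \
  -D CMAKE_INSTALL_PREFIX=${KOKKOS_DIR} \
  -D CMAKE_BUILD_TYPE=Debug \
  -D CMAKE_CXX_COMPILER=${SCRATCH_DIR}/kokkos/bin/nvcc_wrapper \
  -D Kokkos_ENABLE_CUDA=ON \
  -D Kokkos_ENABLE_CUDA_LAMBDA=ON \
  -D Kokkos_ARCH_VOLTA70=ON \
  ..
make -j${NPROCS} install

Because only CUDA is requested, the configure output notes that the SERIAL backend is enabled automatically to provide a host execution space, which is why the resulting install reports Kokkos Devices: CUDA;SERIAL.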
Step 16/25 : ENV ARBORX_DIR=/opt/arborx
 ---> Running in 6e375e80f955
 ---> Removed intermediate container 6e375e80f955
 ---> 1c4b981153b9
Step 17/25 : RUN ARBORX_VERSION=v1.4 &&     ARBORX_URL=https://github.com/arborx/ArborX/archive/${ARBORX_VERSION}.tar.gz &&     ARBORX_ARCHIVE=arborx.tar.gz &&     wget --quiet ${ARBORX_URL} --output-document=${ARBORX_ARCHIVE} &&     mkdir arborx &&     tar -xf ${ARBORX_ARCHIVE} -C arborx --strip-components=1 &&     cd arborx &&     mkdir -p build && cd build &&     cmake       -D CMAKE_INSTALL_PREFIX=${ARBORX_DIR}       -D CMAKE_BUILD_TYPE=Debug       -D CMAKE_CXX_COMPILER=${KOKKOS_DIR}/bin/nvcc_wrapper       -D CMAKE_CXX_EXTENSIONS=OFF       -D CMAKE_PREFIX_PATH=${KOKKOS_DIR}     .. &&     make -j${NPROCS} install &&     cd ../.. && rm -r arborx
 ---> Running in fcb7ecf69b3a
-- The CXX compiler identification is GNU 9.4.0
-- Check for working CXX compiler: /opt/kokkos/bin/nvcc_wrapper
-- Check for working CXX compiler: /opt/kokkos/bin/nvcc_wrapper -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Enabled Kokkos devices: CUDA;SERIAL
-- Found Kokkos: /opt/kokkos/lib/cmake/Kokkos (version "4.1.00")
-- Found Kokkos_OPTIONS: CUDA_LAMBDA  
-- ArborX hash = 'No hash available'
-- Configuring done
-- Generating done
-- Build files have been written to: /arborx/build
Scanning dependencies of target record_hash
-- ArborX hash = 'No hash available'
Built target record_hash
Install the project...
-- Install configuration: "Debug"
-- Installing: /opt/arborx/lib/cmake/ArborX/ArborXTargets.cmake
-- Installing: /opt/arborx/lib/cmake/ArborX/ArborXConfig.cmake
-- Installing: /opt/arborx/lib/cmake/ArborX/ArborXConfigVersion.cmake
-- Installing: /opt/arborx/lib/cmake/ArborX/ArborXSettings.cmake
-- Installing: /opt/arborx/include/ArborX
-- Installing: /opt/arborx/include/ArborX/ArborX_HDBSCAN.hpp
-- Installing: /opt/arborx/include/ArborX/ArborX_DBSCAN.hpp
-- Installing: /opt/arborx/include/ArborX/ArborX_LinearBVH.hpp
-- Installing: /opt/arborx/include/ArborX/ArborX.hpp
-- Installing: /opt/arborx/include/ArborX/kokkos_ext
-- Installing: /opt/arborx/include/ArborX/kokkos_ext/ArborX_DetailsKokkosExtVersion.hpp
-- Installing: /opt/arborx/include/ArborX/kokkos_ext/ArborX_DetailsKokkosExtSort.hpp
-- Installing: /opt/arborx/include/ArborX/kokkos_ext/ArborX_DetailsKokkosExtViewHelpers.hpp
-- Installing: /opt/arborx/include/ArborX/kokkos_ext/ArborX_DetailsKokkosExtScopedProfileRegion.hpp
-- Installing: /opt/arborx/include/ArborX/kokkos_ext/ArborX_DetailsKokkosExtSwap.hpp
-- Installing: /opt/arborx/include/ArborX/kokkos_ext/ArborX_DetailsKokkosExtClassLambda.hpp
-- Installing: /opt/arborx/include/ArborX/kokkos_ext/ArborX_DetailsKokkosExtAccessibilityTraits.hpp
-- Installing: /opt/arborx/include/ArborX/kokkos_ext/ArborX_DetailsKokkosExtMinMaxOperations.hpp
-- Installing: /opt/arborx/include/ArborX/kokkos_ext/ArborX_DetailsKokkosExtArithmeticTraits.hpp
-- Installing: /opt/arborx/include/ArborX/details
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsHeap.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsCrsGraphWrapperImpl.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsStack.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsDendrogram.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsTreeNodeLabeling.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsFDBSCAN.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_PairIndexRank.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_Dendrogram.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsCartesianGrid.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsUnionFind.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsNode.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsUtils.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsFDBSCANDenseBox.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsTreeTraversal.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_MinimumSpanningTree.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsHappyTreeFriends.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_Callbacks.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_TraversalPolicy.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsOperatorFunctionObjects.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsExpandHalfToFull.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsBatchedQueries.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_Predicates.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsContainers.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsTreeVisualization.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsMutualReachabilityDistance.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsPermutedData.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsHalfTraversal.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsWeightedEdge.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsMortonCode.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsTreeConstruction.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_AccessTraits.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_Exception.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsPriorityQueue.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsSortUtils.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_SpaceFillingCurves.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_NeighborList.hpp
-- Installing: /opt/arborx/include/ArborX/details/ArborX_DetailsBruteForceImpl.hpp
-- Installing: /opt/arborx/include/ArborX/ArborX_BruteForce.hpp
-- Installing: /opt/arborx/include/ArborX/geometry
-- Installing: /opt/arborx/include/ArborX/geometry/ArborX_Point.hpp
-- Installing: /opt/arborx/include/ArborX/geometry/ArborX_DetailsAlgorithms.hpp
-- Installing: /opt/arborx/include/ArborX/geometry/ArborX_HyperSphere.hpp
-- Installing: /opt/arborx/include/ArborX/geometry/ArborX_HyperPoint.hpp
-- Installing: /opt/arborx/include/ArborX/geometry/ArborX_HyperBox.hpp
-- Installing: /opt/arborx/include/ArborX/geometry/ArborX_Sphere.hpp
-- Installing: /opt/arborx/include/ArborX/geometry/ArborX_GeometryTraits.hpp
-- Installing: /opt/arborx/include/ArborX/geometry/ArborX_Box.hpp
-- Installing: /opt/arborx/include/ArborX/geometry/ArborX_Ray.hpp
-- Installing: /opt/arborx/include/ArborX/geometry/ArborX_KDOP.hpp
-- Installing: /opt/arborx/include/ArborX/geometry/ArborX_Triangle.hpp
-- Installing: /opt/arborx/include/ArborX/ArborX_CrsGraphWrapper.hpp
-- Up-to-date: /opt/arborx/include/ArborX
-- Installing: /opt/arborx/include/ArborX/ArborX_Config.hpp
-- Installing: /opt/arborx/include/ArborX/ArborX_Version.hpp
 ---> Removed intermediate container fcb7ecf69b3a
 ---> e50ba614de3b
Step 18/25 : ARG FFTW_VERSION=3.3.8
 ---> Running in 4b2f128da84a
 ---> Removed intermediate container 4b2f128da84a
 ---> de0a2fead39e
Step 19/25 : ENV FFTW_DIR=/opt/fftw
 ---> Running in 665a2373005a
 ---> Removed intermediate container 665a2373005a
 ---> 8b212ab2f7e6
Step 20/25 : RUN FFTW_URL=http://www.fftw.org/fftw-${FFTW_VERSION}.tar.gz &&     FFTW_ARCHIVE=fftw.tar.gz &&     SCRATCH_DIR=/scratch && mkdir -p ${SCRATCH_DIR} && cd ${SCRATCH_DIR} &&     wget --quiet ${FFTW_URL} --output-document=${FFTW_ARCHIVE} &&     mkdir -p fftw &&     tar -xf ${FFTW_ARCHIVE} -C fftw --strip-components=1 &&     cd fftw &&     mkdir -p build && cd build &&     cmake       -D CMAKE_INSTALL_PREFIX=${FFTW_DIR}       -D CMAKE_BUILD_TYPE=Debug       -D ENABLE_FLOAT=ON     .. &&     make -j${NPROCS} install &&     cmake       -D CMAKE_INSTALL_PREFIX=${FFTW_DIR}       -D CMAKE_BUILD_TYPE=Debug       -D ENABLE_FLOAT=OFF     .. &&     make -j${NPROCS} install &&     rm -rf ${SCRATCH_DIR}
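For readability, the single RUN command above expands to roughly the shell sketch below. It is only a restatement of the logged command, not an extra step: FFTW_VERSION, FFTW_DIR and NPROCS are assumed to carry the values set by the surrounding Dockerfile ARG/ENV lines (3.3.8, /opt/fftw, and the build parallelism). The same build tree is configured twice, first with ENABLE_FLOAT=ON for the single-precision library (fftw3f) and then with ENABLE_FLOAT=OFF for double precision (fftw3), which is why two full compile passes appear in the output that follows.

    #!/bin/sh
    # Sketch of Step 20/25: build and install FFTW twice into ${FFTW_DIR}.
    set -e
    FFTW_URL=http://www.fftw.org/fftw-${FFTW_VERSION}.tar.gz
    SCRATCH_DIR=/scratch
    mkdir -p ${SCRATCH_DIR} && cd ${SCRATCH_DIR}
    wget --quiet ${FFTW_URL} --output-document=fftw.tar.gz
    mkdir -p fftw && tar -xf fftw.tar.gz -C fftw --strip-components=1
    mkdir -p fftw/build && cd fftw/build
    # Pass 1: single precision (libfftw3f), Debug configuration.
    cmake -D CMAKE_INSTALL_PREFIX=${FFTW_DIR} -D CMAKE_BUILD_TYPE=Debug -D ENABLE_FLOAT=ON ..
    make -j${NPROCS} install
    # Pass 2: reconfigure the same tree for double precision (libfftw3).
    cmake -D CMAKE_INSTALL_PREFIX=${FFTW_DIR} -D CMAKE_BUILD_TYPE=Debug -D ENABLE_FLOAT=OFF ..
    make -j${NPROCS} install
    # Remove the scratch area, as in the original RUN command.
    rm -rf ${SCRATCH_DIR}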
 ---> Running in b4a543a9e004
-- The C compiler identification is GNU 9.4.0
-- The CXX compiler identification is GNU 9.4.0
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Looking for alloca.h
-- Looking for alloca.h - found
-- Looking for altivec.h
-- Looking for altivec.h - not found
-- Looking for c_asm.h
-- Looking for c_asm.h - not found
-- Looking for dlfcn.h
-- Looking for dlfcn.h - found
-- Looking for intrinsics.h
-- Looking for intrinsics.h - not found
-- Looking for inttypes.h
-- Looking for inttypes.h - found
-- Looking for libintl.h
-- Looking for libintl.h - found
-- Looking for limits.h
-- Looking for limits.h - found
-- Looking for mach/mach_time.h
-- Looking for mach/mach_time.h - not found
-- Looking for malloc.h
-- Looking for malloc.h - found
-- Looking for memory.h
-- Looking for memory.h - found
-- Looking for stddef.h
-- Looking for stddef.h - found
-- Looking for stdint.h
-- Looking for stdint.h - found
-- Looking for stdlib.h
-- Looking for stdlib.h - found
-- Looking for string.h
-- Looking for string.h - found
-- Looking for strings.h
-- Looking for strings.h - found
-- Looking for sys/types.h
-- Looking for sys/types.h - found
-- Looking for sys/time.h
-- Looking for sys/time.h - found
-- Looking for sys/stat.h
-- Looking for sys/stat.h - found
-- Looking for sys/sysctl.h
-- Looking for sys/sysctl.h - found
-- Looking for time.h
-- Looking for time.h - found
-- Looking for uintptr.h
-- Looking for uintptr.h - not found
-- Looking for unistd.h
-- Looking for unistd.h - found
-- Checking prototype drand48 for HAVE_DECL_DRAND48 - True
-- Checking prototype srand48 for HAVE_DECL_SRAND48 - True
-- Checking prototype cosl for HAVE_DECL_COSL - True
-- Checking prototype sinl for HAVE_DECL_SINL - True
-- Checking prototype memalign for HAVE_DECL_MEMALIGN - True
-- Checking prototype posix_memalign for HAVE_DECL_POSIX_MEMALIGN - True
-- Looking for clock_gettime
-- Looking for clock_gettime - found
-- Looking for gettimeofday
-- Looking for gettimeofday - found
-- Looking for getpagesize
-- Looking for getpagesize - found
-- Looking for drand48
-- Looking for drand48 - found
-- Looking for srand48
-- Looking for srand48 - found
-- Looking for memalign
-- Looking for memalign - found
-- Looking for posix_memalign
-- Looking for posix_memalign - found
-- Looking for mach_absolute_time
-- Looking for mach_absolute_time - not found
-- Looking for alloca
-- Looking for alloca - found
-- Looking for isnan
-- Looking for isnan - found
-- Looking for snprintf
-- Looking for snprintf - found
-- Looking for strchr
-- Looking for strchr - found
-- Looking for sysctl
-- Looking for sysctl - not found
-- Looking for cosl
-- Looking for cosl - found
-- Looking for sinl
-- Looking for sinl - found
-- Check size of float
-- Check size of float - done
-- Check size of double
-- Check size of double - done
-- Check size of int
-- Check size of int - done
-- Check size of long
-- Check size of long - done
-- Check size of long long
-- Check size of long long - done
-- Check size of unsigned int
-- Check size of unsigned int - done
-- Check size of unsigned long
-- Check size of unsigned long - done
-- Check size of unsigned long long
-- Check size of unsigned long long - done
-- Check size of size_t
-- Check size of size_t - done
-- Check size of ptrdiff_t
-- Check size of ptrdiff_t - done
-- Configuring done
-- Generating done
-- Build files have been written to: /scratch/fftw/build
Scanning dependencies of target fftw3f
[  1%] Building C object CMakeFiles/fftw3f.dir/api/apiplan.c.o
[  1%] Building C object CMakeFiles/fftw3f.dir/api/configure.c.o
[  1%] Building C object CMakeFiles/fftw3f.dir/api/execute-dft-r2c.c.o
[  1%] Building C object CMakeFiles/fftw3f.dir/api/execute-dft-c2r.c.o
[  2%] Building C object CMakeFiles/fftw3f.dir/api/execute-dft.c.o
[  2%] Building C object CMakeFiles/fftw3f.dir/api/execute-r2r.c.o
[  2%] Building C object CMakeFiles/fftw3f.dir/api/execute-split-dft-c2r.c.o
[  2%] Building C object CMakeFiles/fftw3f.dir/api/execute-split-dft-r2c.c.o
[  3%] Building C object CMakeFiles/fftw3f.dir/api/execute-split-dft.c.o
[  3%] Building C object CMakeFiles/fftw3f.dir/api/execute.c.o
[  3%] Building C object CMakeFiles/fftw3f.dir/api/export-wisdom-to-file.c.o
[  3%] Building C object CMakeFiles/fftw3f.dir/api/export-wisdom-to-string.c.o
[  3%] Building C object CMakeFiles/fftw3f.dir/api/export-wisdom.c.o
[  3%] Building C object CMakeFiles/fftw3f.dir/api/f77api.c.o
[  4%] Building C object CMakeFiles/fftw3f.dir/api/flops.c.o
[  4%] Building C object CMakeFiles/fftw3f.dir/api/forget-wisdom.c.o
[  4%] Building C object CMakeFiles/fftw3f.dir/api/import-system-wisdom.c.o
[  4%] Building C object CMakeFiles/fftw3f.dir/api/import-wisdom-from-file.c.o
[  4%] Building C object CMakeFiles/fftw3f.dir/api/import-wisdom-from-string.c.o
[  5%] Building C object CMakeFiles/fftw3f.dir/api/import-wisdom.c.o
[  5%] Building C object CMakeFiles/fftw3f.dir/api/malloc.c.o
[  5%] Building C object CMakeFiles/fftw3f.dir/api/map-r2r-kind.c.o
[  5%] Building C object CMakeFiles/fftw3f.dir/api/mapflags.c.o
[  6%] Building C object CMakeFiles/fftw3f.dir/api/mkprinter-file.c.o
[  6%] Building C object CMakeFiles/fftw3f.dir/api/mkprinter-str.c.o
[  6%] Building C object CMakeFiles/fftw3f.dir/api/mktensor-iodims.c.o
[  6%] Building C object CMakeFiles/fftw3f.dir/api/mktensor-iodims64.c.o
[  6%] Building C object CMakeFiles/fftw3f.dir/api/mktensor-rowmajor.c.o
[  7%] Building C object CMakeFiles/fftw3f.dir/api/plan-dft-1d.c.o
[  7%] Building C object CMakeFiles/fftw3f.dir/api/plan-dft-2d.c.o
[  7%] Building C object CMakeFiles/fftw3f.dir/api/plan-dft-3d.c.o
[  7%] Building C object CMakeFiles/fftw3f.dir/api/plan-dft-c2r-1d.c.o
[  7%] Building C object CMakeFiles/fftw3f.dir/api/plan-dft-c2r-2d.c.o
[  8%] Building C object CMakeFiles/fftw3f.dir/api/plan-dft-c2r-3d.c.o
[  8%] Building C object CMakeFiles/fftw3f.dir/api/plan-dft-c2r.c.o
[  8%] Building C object CMakeFiles/fftw3f.dir/api/plan-dft-r2c-1d.c.o
[  8%] Building C object CMakeFiles/fftw3f.dir/api/plan-dft-r2c-2d.c.o
[  9%] Building C object CMakeFiles/fftw3f.dir/api/plan-dft-r2c-3d.c.o
[  9%] Building C object CMakeFiles/fftw3f.dir/api/plan-dft-r2c.c.o
[  9%] Building C object CMakeFiles/fftw3f.dir/api/plan-dft.c.o
[  9%] Building C object CMakeFiles/fftw3f.dir/api/plan-guru-dft-c2r.c.o
[  9%] Building C object CMakeFiles/fftw3f.dir/api/plan-guru-dft-r2c.c.o
[ 10%] Building C object CMakeFiles/fftw3f.dir/api/plan-guru-dft.c.o
[ 10%] Building C object CMakeFiles/fftw3f.dir/api/plan-guru-r2r.c.o
[ 10%] Building C object CMakeFiles/fftw3f.dir/api/plan-guru-split-dft-c2r.c.o
[ 10%] Building C object CMakeFiles/fftw3f.dir/api/plan-guru-split-dft-r2c.c.o
[ 10%] Building C object CMakeFiles/fftw3f.dir/api/plan-guru-split-dft.c.o
[ 11%] Building C object CMakeFiles/fftw3f.dir/api/plan-guru64-dft-c2r.c.o
[ 11%] Building C object CMakeFiles/fftw3f.dir/api/plan-guru64-dft-r2c.c.o
[ 11%] Building C object CMakeFiles/fftw3f.dir/api/plan-guru64-dft.c.o
[ 11%] Building C object CMakeFiles/fftw3f.dir/api/plan-guru64-r2r.c.o
[ 12%] Building C object CMakeFiles/fftw3f.dir/api/plan-guru64-split-dft-c2r.c.o
[ 12%] Building C object CMakeFiles/fftw3f.dir/api/plan-guru64-split-dft-r2c.c.o
[ 12%] Building C object CMakeFiles/fftw3f.dir/api/plan-guru64-split-dft.c.o
[ 12%] Building C object CMakeFiles/fftw3f.dir/api/plan-many-dft-c2r.c.o
[ 12%] Building C object CMakeFiles/fftw3f.dir/api/plan-many-dft-r2c.c.o
[ 12%] Building C object CMakeFiles/fftw3f.dir/api/plan-many-r2r.c.o
[ 13%] Building C object CMakeFiles/fftw3f.dir/api/plan-many-dft.c.o
[ 13%] Building C object CMakeFiles/fftw3f.dir/api/plan-r2r-2d.c.o
[ 13%] Building C object CMakeFiles/fftw3f.dir/api/plan-r2r-1d.c.o
[ 13%] Building C object CMakeFiles/fftw3f.dir/api/plan-r2r-3d.c.o
[ 14%] Building C object CMakeFiles/fftw3f.dir/api/print-plan.c.o
[ 14%] Building C object CMakeFiles/fftw3f.dir/api/rdft2-pad.c.o
[ 14%] Building C object CMakeFiles/fftw3f.dir/api/plan-r2r.c.o
[ 14%] Building C object CMakeFiles/fftw3f.dir/api/the-planner.c.o
[ 15%] Building C object CMakeFiles/fftw3f.dir/api/version.c.o
[ 15%] Building C object CMakeFiles/fftw3f.dir/dft/bluestein.c.o
[ 15%] Building C object CMakeFiles/fftw3f.dir/dft/buffered.c.o
[ 15%] Building C object CMakeFiles/fftw3f.dir/dft/conf.c.o
[ 15%] Building C object CMakeFiles/fftw3f.dir/dft/ct.c.o
[ 16%] Building C object CMakeFiles/fftw3f.dir/dft/dftw-direct.c.o
[ 16%] Building C object CMakeFiles/fftw3f.dir/dft/dftw-directsq.c.o
[ 16%] Building C object CMakeFiles/fftw3f.dir/dft/dftw-generic.c.o
[ 16%] Building C object CMakeFiles/fftw3f.dir/dft/dftw-genericbuf.c.o
[ 16%] Building C object CMakeFiles/fftw3f.dir/dft/direct.c.o
[ 17%] Building C object CMakeFiles/fftw3f.dir/dft/generic.c.o
[ 17%] Building C object CMakeFiles/fftw3f.dir/dft/indirect-transpose.c.o
[ 17%] Building C object CMakeFiles/fftw3f.dir/dft/indirect.c.o
[ 17%] Building C object CMakeFiles/fftw3f.dir/dft/kdft-dif.c.o
[ 18%] Building C object CMakeFiles/fftw3f.dir/dft/kdft-difsq.c.o
[ 18%] Building C object CMakeFiles/fftw3f.dir/dft/kdft-dit.c.o
[ 18%] Building C object CMakeFiles/fftw3f.dir/dft/kdft.c.o
[ 18%] Building C object CMakeFiles/fftw3f.dir/dft/nop.c.o
[ 18%] Building C object CMakeFiles/fftw3f.dir/dft/plan.c.o
[ 19%] Building C object CMakeFiles/fftw3f.dir/dft/problem.c.o
[ 19%] Building C object CMakeFiles/fftw3f.dir/dft/rader.c.o
[ 19%] Building C object CMakeFiles/fftw3f.dir/dft/rank-geq2.c.o
[ 19%] Building C object CMakeFiles/fftw3f.dir/dft/solve.c.o
[ 19%] Building C object CMakeFiles/fftw3f.dir/dft/vrank-geq1.c.o
[ 20%] Building C object CMakeFiles/fftw3f.dir/dft/zero.c.o
[ 20%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/n.c.o
[ 20%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/t.c.o
[ 20%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/codlist.c.o
[ 21%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/n1_10.c.o
[ 21%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/n1_11.c.o
[ 21%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/n1_12.c.o
[ 21%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/n1_13.c.o
[ 21%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/n1_14.c.o
[ 22%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/n1_15.c.o
[ 22%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/n1_16.c.o
[ 22%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/n1_2.c.o
[ 22%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/n1_20.c.o
[ 22%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/n1_25.c.o
[ 23%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/n1_3.c.o
[ 23%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/n1_32.c.o
[ 23%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/n1_4.c.o
[ 23%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/n1_5.c.o
[ 24%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/n1_6.c.o
[ 24%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/n1_64.c.o
[ 24%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/n1_7.c.o
[ 24%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/n1_8.c.o
[ 24%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/n1_9.c.o
[ 25%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/q1_2.c.o
[ 25%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/q1_3.c.o
[ 25%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/q1_4.c.o
[ 25%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/q1_5.c.o
[ 25%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/q1_6.c.o
[ 26%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/q1_8.c.o
[ 26%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/t1_10.c.o
[ 26%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/t1_12.c.o
[ 26%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/t1_15.c.o
[ 27%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/t1_16.c.o
[ 27%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/t1_2.c.o
[ 27%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/t1_20.c.o
[ 27%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/t1_25.c.o
[ 27%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/t1_3.c.o
[ 28%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/t1_32.c.o
[ 28%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/t1_4.c.o
[ 28%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/t1_5.c.o
[ 28%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/t1_6.c.o
[ 28%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/t1_64.c.o
[ 29%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/t1_7.c.o
[ 29%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/t1_8.c.o
[ 29%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/t1_9.c.o
[ 29%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/t2_10.c.o
[ 30%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/t2_16.c.o
[ 30%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/t2_20.c.o
[ 30%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/t2_25.c.o
[ 30%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/t2_32.c.o
[ 30%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/t2_4.c.o
[ 31%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/t2_5.c.o
[ 31%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/t2_64.c.o
[ 31%] Building C object CMakeFiles/fftw3f.dir/dft/scalar/codelets/t2_8.c.o
[ 31%] Building C object CMakeFiles/fftw3f.dir/kernel/align.c.o
[ 31%] Building C object CMakeFiles/fftw3f.dir/kernel/alloc.c.o
[ 32%] Building C object CMakeFiles/fftw3f.dir/kernel/awake.c.o
[ 32%] Building C object CMakeFiles/fftw3f.dir/kernel/assert.c.o
[ 32%] Building C object CMakeFiles/fftw3f.dir/kernel/buffered.c.o
[ 32%] Building C object CMakeFiles/fftw3f.dir/kernel/cpy1d.c.o
[ 33%] Building C object CMakeFiles/fftw3f.dir/kernel/cpy2d-pair.c.o
[ 33%] Building C object CMakeFiles/fftw3f.dir/kernel/cpy2d.c.o
[ 33%] Building C object CMakeFiles/fftw3f.dir/kernel/ct.c.o
[ 33%] Building C object CMakeFiles/fftw3f.dir/kernel/debug.c.o
[ 33%] Building C object CMakeFiles/fftw3f.dir/kernel/extract-reim.c.o
[ 34%] Building C object CMakeFiles/fftw3f.dir/kernel/hash.c.o
[ 34%] Building C object CMakeFiles/fftw3f.dir/kernel/iabs.c.o
[ 34%] Building C object CMakeFiles/fftw3f.dir/kernel/kalloc.c.o
[ 34%] Building C object CMakeFiles/fftw3f.dir/kernel/md5-1.c.o
[ 34%] Building C object CMakeFiles/fftw3f.dir/kernel/md5.c.o
[ 35%] Building C object CMakeFiles/fftw3f.dir/kernel/minmax.c.o
[ 35%] Building C object CMakeFiles/fftw3f.dir/kernel/ops.c.o
[ 35%] Building C object CMakeFiles/fftw3f.dir/kernel/pickdim.c.o
[ 35%] Building C object CMakeFiles/fftw3f.dir/kernel/plan.c.o
[ 36%] Building C object CMakeFiles/fftw3f.dir/kernel/planner.c.o
[ 36%] Building C object CMakeFiles/fftw3f.dir/kernel/primes.c.o
[ 36%] Building C object CMakeFiles/fftw3f.dir/kernel/print.c.o
[ 36%] Building C object CMakeFiles/fftw3f.dir/kernel/problem.c.o
[ 36%] Building C object CMakeFiles/fftw3f.dir/kernel/rader.c.o
[ 37%] Building C object CMakeFiles/fftw3f.dir/kernel/scan.c.o
[ 37%] Building C object CMakeFiles/fftw3f.dir/kernel/solver.c.o
[ 37%] Building C object CMakeFiles/fftw3f.dir/kernel/solvtab.c.o
[ 37%] Building C object CMakeFiles/fftw3f.dir/kernel/stride.c.o
[ 37%] Building C object CMakeFiles/fftw3f.dir/kernel/tensor.c.o
[ 38%] Building C object CMakeFiles/fftw3f.dir/kernel/tensor1.c.o
[ 38%] Building C object CMakeFiles/fftw3f.dir/kernel/tensor2.c.o
[ 38%] Building C object CMakeFiles/fftw3f.dir/kernel/tensor3.c.o
[ 38%] Building C object CMakeFiles/fftw3f.dir/kernel/tensor4.c.o
[ 39%] Building C object CMakeFiles/fftw3f.dir/kernel/tensor5.c.o
[ 39%] Building C object CMakeFiles/fftw3f.dir/kernel/tensor7.c.o
[ 39%] Building C object CMakeFiles/fftw3f.dir/kernel/tensor8.c.o
[ 39%] Building C object CMakeFiles/fftw3f.dir/kernel/tensor9.c.o
[ 39%] Building C object CMakeFiles/fftw3f.dir/kernel/tile2d.c.o
[ 40%] Building C object CMakeFiles/fftw3f.dir/kernel/timer.c.o
[ 40%] Building C object CMakeFiles/fftw3f.dir/kernel/transpose.c.o
[ 40%] Building C object CMakeFiles/fftw3f.dir/kernel/trig.c.o
[ 40%] Building C object CMakeFiles/fftw3f.dir/kernel/twiddle.c.o
[ 40%] Building C object CMakeFiles/fftw3f.dir/rdft/buffered.c.o
[ 40%] Building C object CMakeFiles/fftw3f.dir/rdft/conf.c.o
[ 41%] Building C object CMakeFiles/fftw3f.dir/rdft/buffered2.c.o
[ 41%] Building C object CMakeFiles/fftw3f.dir/rdft/ct-hc2c-direct.c.o
[ 41%] Building C object CMakeFiles/fftw3f.dir/rdft/ct-hc2c.c.o
[ 42%] Building C object CMakeFiles/fftw3f.dir/rdft/dft-r2hc.c.o
[ 42%] Building C object CMakeFiles/fftw3f.dir/rdft/dht-r2hc.c.o
[ 42%] Building C object CMakeFiles/fftw3f.dir/rdft/dht-rader.c.o
[ 42%] Building C object CMakeFiles/fftw3f.dir/rdft/direct-r2c.c.o
[ 42%] Building C object CMakeFiles/fftw3f.dir/rdft/direct-r2r.c.o
[ 43%] Building C object CMakeFiles/fftw3f.dir/rdft/direct2.c.o
[ 43%] Building C object CMakeFiles/fftw3f.dir/rdft/generic.c.o
[ 43%] Building C object CMakeFiles/fftw3f.dir/rdft/hc2hc-direct.c.o
[ 43%] Building C object CMakeFiles/fftw3f.dir/rdft/hc2hc-generic.c.o
[ 43%] Building C object CMakeFiles/fftw3f.dir/rdft/hc2hc.c.o
[ 44%] Building C object CMakeFiles/fftw3f.dir/rdft/khc2c.c.o
[ 44%] Building C object CMakeFiles/fftw3f.dir/rdft/khc2hc.c.o
[ 44%] Building C object CMakeFiles/fftw3f.dir/rdft/indirect.c.o
[ 44%] Building C object CMakeFiles/fftw3f.dir/rdft/kr2c.c.o
[ 45%] Building C object CMakeFiles/fftw3f.dir/rdft/kr2r.c.o
[ 45%] Building C object CMakeFiles/fftw3f.dir/rdft/nop.c.o
[ 45%] Building C object CMakeFiles/fftw3f.dir/rdft/nop2.c.o
[ 45%] Building C object CMakeFiles/fftw3f.dir/rdft/plan.c.o
[ 45%] Building C object CMakeFiles/fftw3f.dir/rdft/plan2.c.o
[ 46%] Building C object CMakeFiles/fftw3f.dir/rdft/problem.c.o
[ 46%] Building C object CMakeFiles/fftw3f.dir/rdft/problem2.c.o
[ 46%] Building C object CMakeFiles/fftw3f.dir/rdft/rank-geq2-rdft2.c.o
[ 46%] Building C object CMakeFiles/fftw3f.dir/rdft/rank-geq2.c.o
[ 46%] Building C object CMakeFiles/fftw3f.dir/rdft/rank0-rdft2.c.o
[ 46%] Building C object CMakeFiles/fftw3f.dir/rdft/rdft-dht.c.o
[ 47%] Building C object CMakeFiles/fftw3f.dir/rdft/rank0.c.o
[ 47%] Building C object CMakeFiles/fftw3f.dir/rdft/rdft2-inplace-strides.c.o
[ 47%] Building C object CMakeFiles/fftw3f.dir/rdft/rdft2-rdft.c.o
[ 48%] Building C object CMakeFiles/fftw3f.dir/rdft/rdft2-strides.c.o
[ 48%] Building C object CMakeFiles/fftw3f.dir/rdft/rdft2-tensor-max-index.c.o
[ 48%] Building C object CMakeFiles/fftw3f.dir/rdft/solve.c.o
[ 48%] Building C object CMakeFiles/fftw3f.dir/rdft/solve2.c.o
[ 48%] Building C object CMakeFiles/fftw3f.dir/rdft/vrank-geq1-rdft2.c.o
[ 49%] Building C object CMakeFiles/fftw3f.dir/rdft/vrank-geq1.c.o
[ 49%] Building C object CMakeFiles/fftw3f.dir/rdft/vrank3-transpose.c.o
[ 49%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/hc2c.c.o
[ 49%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/hfb.c.o
[ 49%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2c.c.o
[ 49%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/codlist.c.o
[ 50%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2r.c.o
[ 50%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hb2_16.c.o
[ 50%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hb2_20.c.o
[ 51%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hb2_32.c.o
[ 51%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hb2_25.c.o
[ 51%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hb2_4.c.o
[ 51%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hb2_5.c.o
[ 51%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hb2_8.c.o
[ 52%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hb_10.c.o
[ 52%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hb_12.c.o
[ 52%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hb_15.c.o
[ 52%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hb_16.c.o
[ 52%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hb_2.c.o
[ 53%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hb_20.c.o
[ 53%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hb_25.c.o
[ 53%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hb_3.c.o
[ 53%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hb_32.c.o
[ 54%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hb_4.c.o
[ 54%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hb_5.c.o
[ 54%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hb_6.c.o
[ 54%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hb_64.c.o
[ 54%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hb_7.c.o
[ 55%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hb_8.c.o
[ 55%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hb_9.c.o
[ 55%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cb2_16.c.o
[ 55%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cb2_20.c.o
[ 55%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cb2_32.c.o
[ 56%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cb2_4.c.o
[ 56%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cb2_8.c.o
[ 56%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cb_10.c.o
[ 56%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cb_12.c.o
[ 57%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cb_16.c.o
[ 57%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cb_2.c.o
[ 57%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cb_20.c.o
[ 57%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cb_32.c.o
[ 57%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cb_4.c.o
[ 58%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cb_6.c.o
[ 58%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cb_8.c.o
[ 58%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cbdft2_16.c.o
[ 58%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cbdft2_20.c.o
[ 58%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cbdft2_32.c.o
[ 59%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cbdft2_4.c.o
[ 59%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cbdft2_8.c.o
[ 59%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cbdft_10.c.o
[ 59%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cbdft_12.c.o
[ 60%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cbdft_16.c.o
[ 60%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cbdft_2.c.o
[ 60%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cbdft_20.c.o
[ 60%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cbdft_32.c.o
[ 60%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cbdft_4.c.o
[ 61%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cbdft_6.c.o
[ 61%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/hc2cbdft_8.c.o
[ 61%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cbIII_10.c.o
[ 61%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cbIII_12.c.o
[ 61%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cbIII_15.c.o
[ 62%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cbIII_16.c.o
[ 62%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cbIII_2.c.o
[ 62%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cbIII_20.c.o
[ 62%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cbIII_25.c.o
[ 63%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cbIII_3.c.o
[ 63%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cbIII_32.c.o
[ 63%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cbIII_4.c.o
[ 63%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cbIII_5.c.o
[ 63%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cbIII_6.c.o
[ 64%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cbIII_64.c.o
[ 64%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cbIII_7.c.o
[ 64%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cbIII_8.c.o
[ 64%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cbIII_9.c.o
[ 64%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cb_10.c.o
[ 65%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cb_11.c.o
[ 65%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cb_12.c.o
[ 65%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cb_128.c.o
[ 65%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cb_13.c.o
[ 66%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cb_14.c.o
[ 66%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cb_15.c.o
[ 66%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cb_16.c.o
[ 66%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cb_2.c.o
[ 66%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cb_20.c.o
[ 67%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cb_25.c.o
[ 67%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cb_3.c.o
[ 67%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cb_32.c.o
[ 67%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cb_4.c.o
[ 67%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cb_5.c.o
[ 68%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cb_6.c.o
[ 68%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cb_64.c.o
[ 68%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cb_7.c.o
[ 68%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cb_8.c.o
[ 69%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cb/r2cb_9.c.o
[ 69%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/codlist.c.o
[ 69%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cf2_16.c.o
[ 69%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cf2_20.c.o
[ 69%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cf2_32.c.o
[ 70%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cf2_4.c.o
[ 70%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cf_10.c.o
[ 70%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cf2_8.c.o
[ 70%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cf_12.c.o
[ 70%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cf_16.c.o
[ 71%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cf_20.c.o
[ 71%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cf_2.c.o
[ 71%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cf_32.c.o
[ 71%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cf_4.c.o
[ 72%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cf_6.c.o
[ 72%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cf_8.c.o
[ 72%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cfdft2_16.c.o
[ 72%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cfdft2_20.c.o
[ 72%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cfdft2_32.c.o
[ 73%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cfdft2_4.c.o
[ 73%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cfdft2_8.c.o
[ 73%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cfdft_10.c.o
[ 73%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cfdft_12.c.o
[ 73%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cfdft_16.c.o
[ 74%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cfdft_2.c.o
[ 74%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cfdft_20.c.o
[ 74%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cfdft_32.c.o
[ 74%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cfdft_4.c.o
[ 75%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cfdft_6.c.o
[ 75%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hc2cfdft_8.c.o
[ 75%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hf2_16.c.o
[ 75%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hf2_20.c.o
[ 75%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hf2_25.c.o
[ 76%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hf2_32.c.o
[ 76%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hf2_4.c.o
[ 76%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hf2_5.c.o
[ 76%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hf2_8.c.o
[ 76%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hf_10.c.o
[ 77%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hf_12.c.o
[ 77%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hf_15.c.o
[ 77%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hf_16.c.o
[ 77%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hf_2.c.o
[ 78%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hf_20.c.o
[ 78%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hf_25.c.o
[ 78%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hf_3.c.o
[ 78%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hf_32.c.o
[ 78%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hf_4.c.o
[ 79%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hf_5.c.o
[ 79%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hf_6.c.o
[ 79%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hf_64.c.o
[ 79%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hf_7.c.o
[ 79%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hf_8.c.o
[ 80%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/hf_9.c.o
[ 80%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cfII_10.c.o
[ 80%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cfII_12.c.o
[ 80%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cfII_15.c.o
[ 81%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cfII_16.c.o
[ 81%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cfII_2.c.o
[ 81%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cfII_20.c.o
[ 81%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cfII_25.c.o
[ 81%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cfII_3.c.o
[ 82%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cfII_32.c.o
[ 82%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cfII_4.c.o
[ 82%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cfII_5.c.o
[ 82%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cfII_6.c.o
[ 82%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cfII_64.c.o
[ 83%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cfII_7.c.o
[ 83%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cfII_8.c.o
[ 83%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cfII_9.c.o
[ 83%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cf_10.c.o
[ 84%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cf_11.c.o
[ 84%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cf_12.c.o
[ 84%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cf_128.c.o
[ 84%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cf_13.c.o
[ 84%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cf_14.c.o
[ 85%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cf_15.c.o
[ 85%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cf_16.c.o
[ 85%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cf_2.c.o
[ 85%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cf_20.c.o
[ 85%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cf_25.c.o
[ 86%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cf_3.c.o
[ 86%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cf_32.c.o
[ 86%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cf_4.c.o
[ 87%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cf_6.c.o
[ 87%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cf_5.c.o
[ 87%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cf_64.c.o
[ 87%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cf_7.c.o
[ 87%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cf_8.c.o
[ 87%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2cf/r2cf_9.c.o
[ 88%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2r/codlist.c.o
[ 88%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2r/e01_8.c.o
[ 88%] Building C object CMakeFiles/fftw3f.dir/rdft/scalar/r2r/e10_8.c.o
[ 88%] Building C object CMakeFiles/fftw3f.dir/reodft/conf.c.o
[ 88%] Building C object CMakeFiles/fftw3f.dir/reodft/redft00e-r2hc-pad.c.o
[ 89%] Building C object CMakeFiles/fftw3f.dir/reodft/reodft00e-splitradix.c.o
[ 89%] Building C object CMakeFiles/fftw3f.dir/reodft/redft00e-r2hc.c.o
[ 89%] Building C object CMakeFiles/fftw3f.dir/reodft/reodft010e-r2hc.c.o
[ 89%] Building C object CMakeFiles/fftw3f.dir/reodft/reodft11e-r2hc-odd.c.o
[ 90%] Building C object CMakeFiles/fftw3f.dir/reodft/reodft11e-r2hc.c.o
[ 90%] Building C object CMakeFiles/fftw3f.dir/reodft/reodft11e-radix2.c.o
[ 90%] Building C object CMakeFiles/fftw3f.dir/reodft/rodft00e-r2hc-pad.c.o
[ 90%] Building C object CMakeFiles/fftw3f.dir/reodft/rodft00e-r2hc.c.o
[ 90%] Building C object CMakeFiles/fftw3f.dir/simd-support/altivec.c.o
[ 91%] Building C object CMakeFiles/fftw3f.dir/simd-support/avx-128-fma.c.o
[ 91%] Building C object CMakeFiles/fftw3f.dir/simd-support/avx.c.o
[ 91%] Building C object CMakeFiles/fftw3f.dir/simd-support/avx2.c.o
[ 91%] Building C object CMakeFiles/fftw3f.dir/simd-support/avx512.c.o
[ 91%] Building C object CMakeFiles/fftw3f.dir/simd-support/kcvi.c.o
[ 92%] Building C object CMakeFiles/fftw3f.dir/simd-support/neon.c.o
[ 92%] Building C object CMakeFiles/fftw3f.dir/simd-support/sse2.c.o
[ 92%] Building C object CMakeFiles/fftw3f.dir/simd-support/taint.c.o
[ 92%] Building C object CMakeFiles/fftw3f.dir/simd-support/vsx.c.o
[ 93%] Linking C shared library libfftw3f.so
[ 93%] Built target fftw3f
Scanning dependencies of target bench
[ 93%] Building C object CMakeFiles/bench.dir/libbench2/after-ccopy-from.c.o
[ 93%] Building C object CMakeFiles/bench.dir/libbench2/after-ccopy-to.c.o
[ 93%] Building C object CMakeFiles/bench.dir/libbench2/after-hccopy-from.c.o
[ 93%] Building C object CMakeFiles/bench.dir/libbench2/after-hccopy-to.c.o
[ 94%] Building C object CMakeFiles/bench.dir/libbench2/after-rcopy-from.c.o
[ 94%] Building C object CMakeFiles/bench.dir/libbench2/after-rcopy-to.c.o
[ 94%] Building C object CMakeFiles/bench.dir/libbench2/aset.c.o
[ 94%] Building C object CMakeFiles/bench.dir/libbench2/allocate.c.o
[ 94%] Building C object CMakeFiles/bench.dir/libbench2/bench-cost-postprocess.c.o
[ 95%] Building C object CMakeFiles/bench.dir/libbench2/bench-exit.c.o
[ 95%] Building C object CMakeFiles/bench.dir/libbench2/bench-main.c.o
[ 95%] Building C object CMakeFiles/bench.dir/libbench2/can-do.c.o
[ 95%] Building C object CMakeFiles/bench.dir/libbench2/caset.c.o
[ 95%] Building C object CMakeFiles/bench.dir/libbench2/dotens2.c.o
[ 96%] Building C object CMakeFiles/bench.dir/libbench2/info.c.o
[ 96%] Building C object CMakeFiles/bench.dir/libbench2/main.c.o
[ 96%] Building C object CMakeFiles/bench.dir/libbench2/mflops.c.o
[ 96%] Building C object CMakeFiles/bench.dir/libbench2/mp.c.o
[ 97%] Building C object CMakeFiles/bench.dir/libbench2/ovtpvt.c.o
[ 97%] Building C object CMakeFiles/bench.dir/libbench2/my-getopt.c.o
[ 97%] Building C object CMakeFiles/bench.dir/libbench2/pow2.c.o
[ 97%] Building C object CMakeFiles/bench.dir/libbench2/problem.c.o
[ 97%] Building C object CMakeFiles/bench.dir/libbench2/report.c.o
[ 98%] Building C object CMakeFiles/bench.dir/libbench2/speed.c.o
[ 98%] Building C object CMakeFiles/bench.dir/libbench2/tensor.c.o
[ 98%] Building C object CMakeFiles/bench.dir/libbench2/timer.c.o
[ 98%] Building C object CMakeFiles/bench.dir/libbench2/util.c.o
[ 98%] Building C object CMakeFiles/bench.dir/libbench2/verify-dft.c.o
[ 99%] Building C object CMakeFiles/bench.dir/libbench2/verify-lib.c.o
[ 99%] Building C object CMakeFiles/bench.dir/libbench2/verify-r2r.c.o
[ 99%] Building C object CMakeFiles/bench.dir/libbench2/verify-rdft2.c.o
[ 99%] Building C object CMakeFiles/bench.dir/libbench2/verify.c.o
[100%] Building C object CMakeFiles/bench.dir/libbench2/zero.c.o
[100%] Building C object CMakeFiles/bench.dir/tests/bench.c.o
[100%] Building C object CMakeFiles/bench.dir/tests/hook.c.o
[100%] Building C object CMakeFiles/bench.dir/tests/fftw-bench.c.o
[100%] Linking C executable bench
[100%] Built target bench
Install the project...
-- Install configuration: "Debug"
-- Installing: /opt/fftw/lib/libfftw3f.so.3
-- Installing: /opt/fftw/lib/libfftw3f.so.3.5.7
-- Installing: /opt/fftw/lib/libfftw3f.so
-- Up-to-date: /opt/fftw/lib/libfftw3f.so.3
-- Up-to-date: /opt/fftw/lib/libfftw3f.so.3.5.7
-- Up-to-date: /opt/fftw/lib/libfftw3f.so
-- Installing: /opt/fftw/include/fftw3.h
-- Installing: /opt/fftw/include/fftw3.f
-- Installing: /opt/fftw/include/fftw3l.f03
-- Installing: /opt/fftw/include/fftw3q.f03
-- Installing: /opt/fftw/include/fftw3.f03
-- Installing: /opt/fftw/lib/pkgconfig/fftwf.pc
-- Installing: /opt/fftw/lib/cmake/fftw3f/FFTW3fConfig.cmake
-- Installing: /opt/fftw/lib/cmake/fftw3f/FFTW3fConfigVersion.cmake
-- Installing: /opt/fftw/lib/cmake/fftw3f/FFTW3LibraryDepends.cmake
-- Installing: /opt/fftw/lib/cmake/fftw3f/FFTW3LibraryDepends-debug.cmake
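The single-precision install above also places a pkg-config file (fftwf.pc) and CMake package config files (FFTW3fConfig.cmake and friends) under /opt/fftw, so a downstream build can locate this FFTW through pkg-config or find_package(FFTW3f). A hypothetical check, not part of this log, could look like:

    # Hypothetical only: /opt/fftw is the CMAKE_INSTALL_PREFIX used above.
    PKG_CONFIG_PATH=/opt/fftw/lib/pkgconfig pkg-config --cflags --libs fftwf
    # For a CMake-based consumer, point the package search path at the install prefix:
    cmake -D CMAKE_PREFIX_PATH=/opt/fftw /path/to/consumer/source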
-- Configuring done
-- Generating done
-- Build files have been written to: /scratch/fftw/build
Scanning dependencies of target fftw3
[  1%] Building C object CMakeFiles/fftw3.dir/api/configure.c.o
[  1%] Building C object CMakeFiles/fftw3.dir/api/apiplan.c.o
[  1%] Building C object CMakeFiles/fftw3.dir/api/execute-dft-r2c.c.o
[  1%] Building C object CMakeFiles/fftw3.dir/api/execute-dft-c2r.c.o
[  2%] Building C object CMakeFiles/fftw3.dir/api/execute-r2r.c.o
[  2%] Building C object CMakeFiles/fftw3.dir/api/execute-dft.c.o
[  2%] Building C object CMakeFiles/fftw3.dir/api/execute-split-dft-c2r.c.o
[  2%] Building C object CMakeFiles/fftw3.dir/api/execute-split-dft-r2c.c.o
[  2%] Building C object CMakeFiles/fftw3.dir/api/execute-split-dft.c.o
[  3%] Building C object CMakeFiles/fftw3.dir/api/execute.c.o
[  3%] Building C object CMakeFiles/fftw3.dir/api/export-wisdom-to-file.c.o
[  3%] Building C object CMakeFiles/fftw3.dir/api/export-wisdom-to-string.c.o
[  3%] Building C object CMakeFiles/fftw3.dir/api/export-wisdom.c.o
[  3%] Building C object CMakeFiles/fftw3.dir/api/f77api.c.o
[  4%] Building C object CMakeFiles/fftw3.dir/api/flops.c.o
[  4%] Building C object CMakeFiles/fftw3.dir/api/forget-wisdom.c.o
[  4%] Building C object CMakeFiles/fftw3.dir/api/import-system-wisdom.c.o
[  4%] Building C object CMakeFiles/fftw3.dir/api/import-wisdom-from-file.c.o
[  4%] Building C object CMakeFiles/fftw3.dir/api/import-wisdom-from-string.c.o
[  5%] Building C object CMakeFiles/fftw3.dir/api/import-wisdom.c.o
[  5%] Building C object CMakeFiles/fftw3.dir/api/malloc.c.o
[  5%] Building C object CMakeFiles/fftw3.dir/api/map-r2r-kind.c.o
[  5%] Building C object CMakeFiles/fftw3.dir/api/mapflags.c.o
[  6%] Building C object CMakeFiles/fftw3.dir/api/mkprinter-file.c.o
[  6%] Building C object CMakeFiles/fftw3.dir/api/mkprinter-str.c.o
[  6%] Building C object CMakeFiles/fftw3.dir/api/mktensor-iodims.c.o
[  6%] Building C object CMakeFiles/fftw3.dir/api/mktensor-iodims64.c.o
[  6%] Building C object CMakeFiles/fftw3.dir/api/mktensor-rowmajor.c.o
[  7%] Building C object CMakeFiles/fftw3.dir/api/plan-dft-1d.c.o
[  7%] Building C object CMakeFiles/fftw3.dir/api/plan-dft-2d.c.o
[  7%] Building C object CMakeFiles/fftw3.dir/api/plan-dft-3d.c.o
[  7%] Building C object CMakeFiles/fftw3.dir/api/plan-dft-c2r-1d.c.o
[  7%] Building C object CMakeFiles/fftw3.dir/api/plan-dft-c2r-2d.c.o
[  8%] Building C object CMakeFiles/fftw3.dir/api/plan-dft-c2r-3d.c.o
[  8%] Building C object CMakeFiles/fftw3.dir/api/plan-dft-c2r.c.o
[  8%] Building C object CMakeFiles/fftw3.dir/api/plan-dft-r2c-1d.c.o
[  8%] Building C object CMakeFiles/fftw3.dir/api/plan-dft-r2c-2d.c.o
[  9%] Building C object CMakeFiles/fftw3.dir/api/plan-dft-r2c-3d.c.o
[  9%] Building C object CMakeFiles/fftw3.dir/api/plan-dft-r2c.c.o
[  9%] Building C object CMakeFiles/fftw3.dir/api/plan-dft.c.o
[  9%] Building C object CMakeFiles/fftw3.dir/api/plan-guru-dft-c2r.c.o
[  9%] Building C object CMakeFiles/fftw3.dir/api/plan-guru-dft-r2c.c.o
[ 10%] Building C object CMakeFiles/fftw3.dir/api/plan-guru-dft.c.o
[ 10%] Building C object CMakeFiles/fftw3.dir/api/plan-guru-r2r.c.o
[ 10%] Building C object CMakeFiles/fftw3.dir/api/plan-guru-split-dft-c2r.c.o
[ 10%] Building C object CMakeFiles/fftw3.dir/api/plan-guru-split-dft-r2c.c.o
[ 10%] Building C object CMakeFiles/fftw3.dir/api/plan-guru-split-dft.c.o
[ 11%] Building C object CMakeFiles/fftw3.dir/api/plan-guru64-dft-c2r.c.o
[ 11%] Building C object CMakeFiles/fftw3.dir/api/plan-guru64-dft-r2c.c.o
[ 11%] Building C object CMakeFiles/fftw3.dir/api/plan-guru64-dft.c.o
[ 11%] Building C object CMakeFiles/fftw3.dir/api/plan-guru64-r2r.c.o
[ 12%] Building C object CMakeFiles/fftw3.dir/api/plan-guru64-split-dft-c2r.c.o
[ 12%] Building C object CMakeFiles/fftw3.dir/api/plan-guru64-split-dft-r2c.c.o
[ 12%] Building C object CMakeFiles/fftw3.dir/api/plan-guru64-split-dft.c.o
[ 12%] Building C object CMakeFiles/fftw3.dir/api/plan-many-dft-c2r.c.o
[ 12%] Building C object CMakeFiles/fftw3.dir/api/plan-many-dft-r2c.c.o
[ 13%] Building C object CMakeFiles/fftw3.dir/api/plan-many-dft.c.o
[ 13%] Building C object CMakeFiles/fftw3.dir/api/plan-many-r2r.c.o
[ 13%] Building C object CMakeFiles/fftw3.dir/api/plan-r2r-1d.c.o
[ 13%] Building C object CMakeFiles/fftw3.dir/api/plan-r2r-2d.c.o
[ 13%] Building C object CMakeFiles/fftw3.dir/api/plan-r2r-3d.c.o
[ 14%] Building C object CMakeFiles/fftw3.dir/api/plan-r2r.c.o
[ 14%] Building C object CMakeFiles/fftw3.dir/api/print-plan.c.o
[ 14%] Building C object CMakeFiles/fftw3.dir/api/rdft2-pad.c.o
[ 14%] Building C object CMakeFiles/fftw3.dir/api/the-planner.c.o
[ 15%] Building C object CMakeFiles/fftw3.dir/api/version.c.o
[ 15%] Building C object CMakeFiles/fftw3.dir/dft/bluestein.c.o
[ 15%] Building C object CMakeFiles/fftw3.dir/dft/buffered.c.o
[ 15%] Building C object CMakeFiles/fftw3.dir/dft/conf.c.o
[ 15%] Building C object CMakeFiles/fftw3.dir/dft/ct.c.o
[ 16%] Building C object CMakeFiles/fftw3.dir/dft/dftw-direct.c.o
[ 16%] Building C object CMakeFiles/fftw3.dir/dft/dftw-directsq.c.o
[ 16%] Building C object CMakeFiles/fftw3.dir/dft/dftw-generic.c.o
[ 16%] Building C object CMakeFiles/fftw3.dir/dft/dftw-genericbuf.c.o
[ 16%] Building C object CMakeFiles/fftw3.dir/dft/direct.c.o
[ 17%] Building C object CMakeFiles/fftw3.dir/dft/generic.c.o
[ 17%] Building C object CMakeFiles/fftw3.dir/dft/indirect-transpose.c.o
[ 17%] Building C object CMakeFiles/fftw3.dir/dft/indirect.c.o
[ 17%] Building C object CMakeFiles/fftw3.dir/dft/kdft-dif.c.o
[ 18%] Building C object CMakeFiles/fftw3.dir/dft/kdft-difsq.c.o
[ 18%] Building C object CMakeFiles/fftw3.dir/dft/kdft-dit.c.o
[ 18%] Building C object CMakeFiles/fftw3.dir/dft/kdft.c.o
[ 18%] Building C object CMakeFiles/fftw3.dir/dft/nop.c.o
[ 18%] Building C object CMakeFiles/fftw3.dir/dft/plan.c.o
[ 19%] Building C object CMakeFiles/fftw3.dir/dft/problem.c.o
[ 19%] Building C object CMakeFiles/fftw3.dir/dft/rader.c.o
[ 19%] Building C object CMakeFiles/fftw3.dir/dft/rank-geq2.c.o
[ 19%] Building C object CMakeFiles/fftw3.dir/dft/solve.c.o
[ 19%] Building C object CMakeFiles/fftw3.dir/dft/vrank-geq1.c.o
[ 20%] Building C object CMakeFiles/fftw3.dir/dft/zero.c.o
[ 20%] Building C object CMakeFiles/fftw3.dir/dft/scalar/n.c.o
[ 20%] Building C object CMakeFiles/fftw3.dir/dft/scalar/t.c.o
[ 20%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/codlist.c.o
[ 21%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/n1_10.c.o
[ 21%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/n1_11.c.o
[ 21%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/n1_12.c.o
[ 21%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/n1_13.c.o
[ 21%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/n1_14.c.o
[ 22%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/n1_15.c.o
[ 22%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/n1_16.c.o
[ 22%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/n1_2.c.o
[ 22%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/n1_20.c.o
[ 22%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/n1_25.c.o
[ 23%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/n1_3.c.o
[ 23%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/n1_32.c.o
[ 23%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/n1_4.c.o
[ 23%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/n1_5.c.o
[ 24%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/n1_6.c.o
[ 24%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/n1_64.c.o
[ 24%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/n1_7.c.o
[ 24%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/n1_8.c.o
[ 24%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/n1_9.c.o
[ 25%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/q1_2.c.o
[ 25%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/q1_3.c.o
[ 25%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/q1_4.c.o
[ 25%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/q1_5.c.o
[ 25%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/q1_6.c.o
[ 26%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/q1_8.c.o
[ 26%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/t1_10.c.o
[ 26%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/t1_12.c.o
[ 26%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/t1_15.c.o
[ 27%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/t1_16.c.o
[ 27%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/t1_2.c.o
[ 27%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/t1_20.c.o
[ 27%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/t1_25.c.o
[ 27%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/t1_3.c.o
[ 28%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/t1_32.c.o
[ 28%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/t1_4.c.o
[ 28%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/t1_5.c.o
[ 28%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/t1_6.c.o
[ 28%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/t1_64.c.o
[ 29%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/t1_8.c.o
[ 29%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/t1_7.c.o
[ 29%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/t1_9.c.o
[ 30%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/t2_10.c.o
[ 30%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/t2_16.c.o
[ 30%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/t2_20.c.o
[ 30%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/t2_25.c.o
[ 30%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/t2_32.c.o
[ 30%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/t2_4.c.o
[ 31%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/t2_5.c.o
[ 31%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/t2_64.c.o
[ 31%] Building C object CMakeFiles/fftw3.dir/dft/scalar/codelets/t2_8.c.o
[ 31%] Building C object CMakeFiles/fftw3.dir/kernel/align.c.o
[ 31%] Building C object CMakeFiles/fftw3.dir/kernel/alloc.c.o
[ 32%] Building C object CMakeFiles/fftw3.dir/kernel/assert.c.o
[ 32%] Building C object CMakeFiles/fftw3.dir/kernel/awake.c.o
[ 32%] Building C object CMakeFiles/fftw3.dir/kernel/buffered.c.o
[ 32%] Building C object CMakeFiles/fftw3.dir/kernel/cpy1d.c.o
[ 33%] Building C object CMakeFiles/fftw3.dir/kernel/cpy2d-pair.c.o
[ 33%] Building C object CMakeFiles/fftw3.dir/kernel/cpy2d.c.o
[ 33%] Building C object CMakeFiles/fftw3.dir/kernel/ct.c.o
[ 33%] Building C object CMakeFiles/fftw3.dir/kernel/debug.c.o
[ 33%] Building C object CMakeFiles/fftw3.dir/kernel/extract-reim.c.o
[ 34%] Building C object CMakeFiles/fftw3.dir/kernel/hash.c.o
[ 34%] Building C object CMakeFiles/fftw3.dir/kernel/iabs.c.o
[ 34%] Building C object CMakeFiles/fftw3.dir/kernel/kalloc.c.o
[ 34%] Building C object CMakeFiles/fftw3.dir/kernel/md5.c.o
[ 34%] Building C object CMakeFiles/fftw3.dir/kernel/md5-1.c.o
[ 35%] Building C object CMakeFiles/fftw3.dir/kernel/minmax.c.o
[ 35%] Building C object CMakeFiles/fftw3.dir/kernel/ops.c.o
[ 35%] Building C object CMakeFiles/fftw3.dir/kernel/pickdim.c.o
[ 35%] Building C object CMakeFiles/fftw3.dir/kernel/plan.c.o
[ 36%] Building C object CMakeFiles/fftw3.dir/kernel/primes.c.o
[ 36%] Building C object CMakeFiles/fftw3.dir/kernel/planner.c.o
[ 36%] Building C object CMakeFiles/fftw3.dir/kernel/print.c.o
[ 36%] Building C object CMakeFiles/fftw3.dir/kernel/problem.c.o
[ 36%] Building C object CMakeFiles/fftw3.dir/kernel/rader.c.o
[ 37%] Building C object CMakeFiles/fftw3.dir/kernel/scan.c.o
[ 37%] Building C object CMakeFiles/fftw3.dir/kernel/solver.c.o
[ 37%] Building C object CMakeFiles/fftw3.dir/kernel/solvtab.c.o
[ 37%] Building C object CMakeFiles/fftw3.dir/kernel/stride.c.o
[ 37%] Building C object CMakeFiles/fftw3.dir/kernel/tensor.c.o
[ 38%] Building C object CMakeFiles/fftw3.dir/kernel/tensor1.c.o
[ 38%] Building C object CMakeFiles/fftw3.dir/kernel/tensor2.c.o
[ 38%] Building C object CMakeFiles/fftw3.dir/kernel/tensor3.c.o
[ 38%] Building C object CMakeFiles/fftw3.dir/kernel/tensor4.c.o
[ 39%] Building C object CMakeFiles/fftw3.dir/kernel/tensor5.c.o
[ 39%] Building C object CMakeFiles/fftw3.dir/kernel/tensor7.c.o
[ 39%] Building C object CMakeFiles/fftw3.dir/kernel/tensor8.c.o
[ 39%] Building C object CMakeFiles/fftw3.dir/kernel/tile2d.c.o
[ 39%] Building C object CMakeFiles/fftw3.dir/kernel/tensor9.c.o
[ 40%] Building C object CMakeFiles/fftw3.dir/kernel/timer.c.o
[ 40%] Building C object CMakeFiles/fftw3.dir/kernel/transpose.c.o
[ 40%] Building C object CMakeFiles/fftw3.dir/kernel/trig.c.o
[ 40%] Building C object CMakeFiles/fftw3.dir/kernel/twiddle.c.o
[ 40%] Building C object CMakeFiles/fftw3.dir/rdft/buffered.c.o
[ 41%] Building C object CMakeFiles/fftw3.dir/rdft/buffered2.c.o
[ 41%] Building C object CMakeFiles/fftw3.dir/rdft/ct-hc2c-direct.c.o
[ 41%] Building C object CMakeFiles/fftw3.dir/rdft/conf.c.o
[ 41%] Building C object CMakeFiles/fftw3.dir/rdft/ct-hc2c.c.o
[ 42%] Building C object CMakeFiles/fftw3.dir/rdft/dft-r2hc.c.o
[ 42%] Building C object CMakeFiles/fftw3.dir/rdft/dht-r2hc.c.o
[ 42%] Building C object CMakeFiles/fftw3.dir/rdft/dht-rader.c.o
[ 42%] Building C object CMakeFiles/fftw3.dir/rdft/direct-r2c.c.o
[ 42%] Building C object CMakeFiles/fftw3.dir/rdft/direct-r2r.c.o
[ 43%] Building C object CMakeFiles/fftw3.dir/rdft/direct2.c.o
[ 43%] Building C object CMakeFiles/fftw3.dir/rdft/generic.c.o
[ 43%] Building C object CMakeFiles/fftw3.dir/rdft/hc2hc-direct.c.o
[ 43%] Building C object CMakeFiles/fftw3.dir/rdft/hc2hc-generic.c.o
[ 43%] Building C object CMakeFiles/fftw3.dir/rdft/hc2hc.c.o
[ 44%] Building C object CMakeFiles/fftw3.dir/rdft/indirect.c.o
[ 44%] Building C object CMakeFiles/fftw3.dir/rdft/khc2c.c.o
[ 44%] Building C object CMakeFiles/fftw3.dir/rdft/khc2hc.c.o
[ 44%] Building C object CMakeFiles/fftw3.dir/rdft/kr2c.c.o
[ 45%] Building C object CMakeFiles/fftw3.dir/rdft/kr2r.c.o
[ 45%] Building C object CMakeFiles/fftw3.dir/rdft/nop.c.o
[ 45%] Building C object CMakeFiles/fftw3.dir/rdft/nop2.c.o
[ 45%] Building C object CMakeFiles/fftw3.dir/rdft/plan.c.o
[ 45%] Building C object CMakeFiles/fftw3.dir/rdft/plan2.c.o
[ 46%] Building C object CMakeFiles/fftw3.dir/rdft/problem.c.o
[ 46%] Building C object CMakeFiles/fftw3.dir/rdft/problem2.c.o
[ 46%] Building C object CMakeFiles/fftw3.dir/rdft/rank-geq2-rdft2.c.o
[ 46%] Building C object CMakeFiles/fftw3.dir/rdft/rank-geq2.c.o
[ 46%] Building C object CMakeFiles/fftw3.dir/rdft/rank0-rdft2.c.o
[ 47%] Building C object CMakeFiles/fftw3.dir/rdft/rdft2-inplace-strides.c.o
[ 47%] Building C object CMakeFiles/fftw3.dir/rdft/rank0.c.o
[ 47%] Building C object CMakeFiles/fftw3.dir/rdft/rdft-dht.c.o
[ 47%] Building C object CMakeFiles/fftw3.dir/rdft/rdft2-rdft.c.o
[ 48%] Building C object CMakeFiles/fftw3.dir/rdft/rdft2-strides.c.o
[ 48%] Building C object CMakeFiles/fftw3.dir/rdft/rdft2-tensor-max-index.c.o
[ 48%] Building C object CMakeFiles/fftw3.dir/rdft/solve.c.o
[ 48%] Building C object CMakeFiles/fftw3.dir/rdft/solve2.c.o
[ 48%] Building C object CMakeFiles/fftw3.dir/rdft/vrank-geq1-rdft2.c.o
[ 49%] Building C object CMakeFiles/fftw3.dir/rdft/vrank-geq1.c.o
[ 49%] Building C object CMakeFiles/fftw3.dir/rdft/vrank3-transpose.c.o
[ 49%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/hc2c.c.o
[ 49%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/hfb.c.o
[ 49%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2c.c.o
[ 50%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2r.c.o
[ 50%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/codlist.c.o
[ 50%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hb2_16.c.o
[ 50%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hb2_20.c.o
[ 51%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hb2_25.c.o
[ 51%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hb2_32.c.o
[ 51%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hb2_4.c.o
[ 51%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hb2_5.c.o
[ 51%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hb2_8.c.o
[ 52%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hb_10.c.o
[ 52%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hb_12.c.o
[ 52%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hb_15.c.o
[ 52%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hb_16.c.o
[ 52%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hb_2.c.o
[ 53%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hb_20.c.o
[ 53%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hb_25.c.o
[ 53%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hb_3.c.o
[ 53%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hb_32.c.o
[ 54%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hb_4.c.o
[ 54%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hb_5.c.o
[ 54%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hb_6.c.o
[ 54%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hb_64.c.o
[ 54%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hb_7.c.o
[ 55%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hb_8.c.o
[ 55%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hb_9.c.o
[ 55%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cb2_16.c.o
[ 55%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cb2_20.c.o
[ 55%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cb2_32.c.o
[ 56%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cb2_4.c.o
[ 56%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cb2_8.c.o
[ 56%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cb_10.c.o
[ 56%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cb_12.c.o
[ 57%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cb_16.c.o
[ 57%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cb_2.c.o
[ 57%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cb_20.c.o
[ 57%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cb_32.c.o
[ 57%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cb_4.c.o
[ 58%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cb_6.c.o
[ 58%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cb_8.c.o
[ 58%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cbdft2_16.c.o
[ 58%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cbdft2_20.c.o
[ 58%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cbdft2_32.c.o
[ 59%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cbdft2_4.c.o
[ 59%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cbdft2_8.c.o
[ 59%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cbdft_10.c.o
[ 59%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cbdft_12.c.o
[ 60%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cbdft_16.c.o
[ 60%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cbdft_2.c.o
[ 60%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cbdft_20.c.o
[ 60%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cbdft_32.c.o
[ 60%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cbdft_4.c.o
[ 61%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cbdft_6.c.o
[ 61%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/hc2cbdft_8.c.o
[ 61%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cbIII_10.c.o
[ 61%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cbIII_12.c.o
[ 61%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cbIII_15.c.o
[ 62%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cbIII_16.c.o
[ 62%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cbIII_2.c.o
[ 62%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cbIII_20.c.o
[ 62%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cbIII_25.c.o
[ 63%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cbIII_3.c.o
[ 63%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cbIII_32.c.o
[ 63%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cbIII_4.c.o
[ 63%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cbIII_5.c.o
[ 63%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cbIII_6.c.o
[ 64%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cbIII_64.c.o
[ 64%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cbIII_7.c.o
[ 64%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cbIII_8.c.o
[ 64%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cbIII_9.c.o
[ 64%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cb_10.c.o
[ 65%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cb_11.c.o
[ 65%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cb_12.c.o
[ 65%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cb_128.c.o
[ 65%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cb_13.c.o
[ 66%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cb_14.c.o
[ 66%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cb_15.c.o
[ 66%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cb_16.c.o
[ 66%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cb_2.c.o
[ 66%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cb_20.c.o
[ 67%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cb_25.c.o
[ 67%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cb_3.c.o
[ 67%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cb_32.c.o
[ 67%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cb_4.c.o
[ 67%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cb_5.c.o
[ 68%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cb_6.c.o
[ 68%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cb_64.c.o
[ 68%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cb_7.c.o
[ 68%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cb_8.c.o
[ 69%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cb/r2cb_9.c.o
[ 69%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/codlist.c.o
[ 69%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cf2_16.c.o
[ 69%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cf2_20.c.o
[ 69%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cf2_32.c.o
[ 70%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cf2_4.c.o
[ 70%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cf2_8.c.o
[ 70%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cf_10.c.o
[ 70%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cf_12.c.o
[ 70%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cf_16.c.o
[ 71%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cf_2.c.o
[ 71%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cf_20.c.o
[ 71%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cf_32.c.o
[ 71%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cf_4.c.o
[ 72%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cf_6.c.o
[ 72%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cf_8.c.o
[ 72%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cfdft2_16.c.o
[ 72%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cfdft2_20.c.o
[ 72%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cfdft2_32.c.o
[ 73%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cfdft2_4.c.o
[ 73%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cfdft2_8.c.o
[ 73%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cfdft_10.c.o
[ 73%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cfdft_12.c.o
[ 73%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cfdft_16.c.o
[ 74%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cfdft_2.c.o
[ 74%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cfdft_20.c.o
[ 74%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cfdft_32.c.o
[ 74%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cfdft_4.c.o
[ 75%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cfdft_6.c.o
[ 75%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hc2cfdft_8.c.o
[ 75%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hf2_16.c.o
[ 75%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hf2_20.c.o
[ 75%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hf2_25.c.o
[ 76%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hf2_32.c.o
[ 76%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hf2_4.c.o
[ 76%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hf2_5.c.o
[ 76%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hf2_8.c.o
[ 76%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hf_10.c.o
[ 77%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hf_12.c.o
[ 77%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hf_15.c.o
[ 77%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hf_16.c.o
[ 77%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hf_2.c.o
[ 78%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hf_20.c.o
[ 78%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hf_25.c.o
[ 78%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hf_3.c.o
[ 78%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hf_32.c.o
[ 78%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hf_4.c.o
[ 79%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hf_5.c.o
[ 79%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hf_6.c.o
[ 79%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hf_64.c.o
[ 79%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hf_7.c.o
[ 79%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hf_8.c.o
[ 80%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/hf_9.c.o
[ 80%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cfII_10.c.o
[ 80%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cfII_12.c.o
[ 80%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cfII_15.c.o
[ 81%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cfII_16.c.o
[ 81%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cfII_2.c.o
[ 81%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cfII_20.c.o
[ 81%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cfII_25.c.o
[ 81%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cfII_3.c.o
[ 82%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cfII_32.c.o
[ 82%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cfII_4.c.o
[ 82%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cfII_5.c.o
[ 82%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cfII_6.c.o
[ 82%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cfII_64.c.o
[ 83%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cfII_7.c.o
[ 83%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cfII_8.c.o
[ 83%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cfII_9.c.o
[ 83%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cf_10.c.o
[ 84%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cf_11.c.o
[ 84%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cf_12.c.o
[ 84%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cf_128.c.o
[ 84%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cf_13.c.o
[ 84%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cf_14.c.o
[ 85%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cf_15.c.o
[ 85%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cf_16.c.o
[ 85%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cf_2.c.o
[ 85%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cf_20.c.o
[ 85%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cf_25.c.o
[ 86%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cf_3.c.o
[ 86%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cf_32.c.o
[ 86%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cf_4.c.o
[ 86%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cf_5.c.o
[ 87%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cf_6.c.o
[ 87%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cf_64.c.o
[ 87%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cf_7.c.o
[ 87%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cf_8.c.o
[ 87%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2cf/r2cf_9.c.o
[ 88%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2r/codlist.c.o
[ 88%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2r/e01_8.c.o
[ 88%] Building C object CMakeFiles/fftw3.dir/rdft/scalar/r2r/e10_8.c.o
[ 88%] Building C object CMakeFiles/fftw3.dir/reodft/conf.c.o
[ 88%] Building C object CMakeFiles/fftw3.dir/reodft/redft00e-r2hc-pad.c.o
[ 89%] Building C object CMakeFiles/fftw3.dir/reodft/redft00e-r2hc.c.o
[ 89%] Building C object CMakeFiles/fftw3.dir/reodft/reodft010e-r2hc.c.o
[ 89%] Building C object CMakeFiles/fftw3.dir/reodft/reodft00e-splitradix.c.o
[ 89%] Building C object CMakeFiles/fftw3.dir/reodft/reodft11e-r2hc-odd.c.o
[ 90%] Building C object CMakeFiles/fftw3.dir/reodft/reodft11e-r2hc.c.o
[ 90%] Building C object CMakeFiles/fftw3.dir/reodft/reodft11e-radix2.c.o
[ 90%] Building C object CMakeFiles/fftw3.dir/reodft/rodft00e-r2hc-pad.c.o
[ 90%] Building C object CMakeFiles/fftw3.dir/reodft/rodft00e-r2hc.c.o
[ 90%] Building C object CMakeFiles/fftw3.dir/simd-support/altivec.c.o
[ 91%] Building C object CMakeFiles/fftw3.dir/simd-support/avx-128-fma.c.o
[ 91%] Building C object CMakeFiles/fftw3.dir/simd-support/avx.c.o
[ 91%] Building C object CMakeFiles/fftw3.dir/simd-support/avx2.c.o
[ 91%] Building C object CMakeFiles/fftw3.dir/simd-support/avx512.c.o
[ 91%] Building C object CMakeFiles/fftw3.dir/simd-support/kcvi.c.o
[ 92%] Building C object CMakeFiles/fftw3.dir/simd-support/neon.c.o
[ 92%] Building C object CMakeFiles/fftw3.dir/simd-support/sse2.c.o
[ 92%] Building C object CMakeFiles/fftw3.dir/simd-support/taint.c.o
[ 92%] Building C object CMakeFiles/fftw3.dir/simd-support/vsx.c.o
[ 93%] Linking C shared library libfftw3.so
[ 93%] Built target fftw3
Scanning dependencies of target bench
[ 93%] Building C object CMakeFiles/bench.dir/libbench2/after-ccopy-from.c.o
[ 93%] Building C object CMakeFiles/bench.dir/libbench2/after-ccopy-to.c.o
[ 93%] Building C object CMakeFiles/bench.dir/libbench2/after-hccopy-from.c.o
[ 93%] Building C object CMakeFiles/bench.dir/libbench2/after-hccopy-to.c.o
[ 93%] Building C object CMakeFiles/bench.dir/libbench2/after-rcopy-to.c.o
[ 94%] Building C object CMakeFiles/bench.dir/libbench2/allocate.c.o
[ 94%] Building C object CMakeFiles/bench.dir/libbench2/after-rcopy-from.c.o
[ 94%] Building C object CMakeFiles/bench.dir/libbench2/aset.c.o
[ 94%] Building C object CMakeFiles/bench.dir/libbench2/bench-cost-postprocess.c.o
[ 95%] Building C object CMakeFiles/bench.dir/libbench2/bench-exit.c.o
[ 95%] Building C object CMakeFiles/bench.dir/libbench2/bench-main.c.o
[ 95%] Building C object CMakeFiles/bench.dir/libbench2/can-do.c.o
[ 95%] Building C object CMakeFiles/bench.dir/libbench2/caset.c.o
[ 95%] Building C object CMakeFiles/bench.dir/libbench2/dotens2.c.o
[ 96%] Building C object CMakeFiles/bench.dir/libbench2/info.c.o
[ 96%] Building C object CMakeFiles/bench.dir/libbench2/main.c.o
[ 96%] Building C object CMakeFiles/bench.dir/libbench2/mflops.c.o
[ 96%] Building C object CMakeFiles/bench.dir/libbench2/mp.c.o
[ 97%] Building C object CMakeFiles/bench.dir/libbench2/my-getopt.c.o
[ 97%] Building C object CMakeFiles/bench.dir/libbench2/ovtpvt.c.o
[ 97%] Building C object CMakeFiles/bench.dir/libbench2/pow2.c.o
[ 97%] Building C object CMakeFiles/bench.dir/libbench2/problem.c.o
[ 97%] Building C object CMakeFiles/bench.dir/libbench2/report.c.o
[ 98%] Building C object CMakeFiles/bench.dir/libbench2/speed.c.o
[ 98%] Building C object CMakeFiles/bench.dir/libbench2/tensor.c.o
[ 98%] Building C object CMakeFiles/bench.dir/libbench2/timer.c.o
[ 98%] Building C object CMakeFiles/bench.dir/libbench2/util.c.o
[ 98%] Building C object CMakeFiles/bench.dir/libbench2/verify-dft.c.o
[ 99%] Building C object CMakeFiles/bench.dir/libbench2/verify-lib.c.o
[ 99%] Building C object CMakeFiles/bench.dir/libbench2/verify-rdft2.c.o
[ 99%] Building C object CMakeFiles/bench.dir/libbench2/verify-r2r.c.o
[ 99%] Building C object CMakeFiles/bench.dir/libbench2/verify.c.o
[100%] Building C object CMakeFiles/bench.dir/libbench2/zero.c.o
[100%] Building C object CMakeFiles/bench.dir/tests/bench.c.o
[100%] Building C object CMakeFiles/bench.dir/tests/hook.c.o
[100%] Building C object CMakeFiles/bench.dir/tests/fftw-bench.c.o
[100%] Linking C executable bench
[100%] Built target bench
Install the project...
-- Install configuration: "Debug"
-- Installing: /opt/fftw/lib/libfftw3.so.3
-- Installing: /opt/fftw/lib/libfftw3.so.3.5.7
-- Installing: /opt/fftw/lib/libfftw3.so
-- Up-to-date: /opt/fftw/lib/libfftw3.so.3
-- Up-to-date: /opt/fftw/lib/libfftw3.so.3.5.7
-- Up-to-date: /opt/fftw/lib/libfftw3.so
-- Up-to-date: /opt/fftw/include/fftw3.h
-- Up-to-date: /opt/fftw/include/fftw3.f
-- Up-to-date: /opt/fftw/include/fftw3l.f03
-- Up-to-date: /opt/fftw/include/fftw3q.f03
-- Installing: /opt/fftw/include/fftw3.f03
-- Installing: /opt/fftw/lib/pkgconfig/fftw.pc
-- Installing: /opt/fftw/lib/cmake/fftw3/FFTW3Config.cmake
-- Installing: /opt/fftw/lib/cmake/fftw3/FFTW3ConfigVersion.cmake
-- Installing: /opt/fftw/lib/cmake/fftw3/FFTW3LibraryDepends.cmake
-- Installing: /opt/fftw/lib/cmake/fftw3/FFTW3LibraryDepends-debug.cmake
 ---> Removed intermediate container b4a543a9e004
 ---> 7ee7bbc2f489
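For reference, the FFTW tree is now installed under /opt/fftw, including its CMake package files (FFTW3Config.cmake and the libraries/headers listed above). A minimal shell sketch of how a later step can sanity-check and reuse that prefix; the /opt/fftw path is taken from the "Installing:" lines, while FFTW_DIR being set to it is an assumption based on how Step 23 below consumes ${FFTW_DIR}:

    # Check that the pieces installed above are where later steps expect them.
    ls /opt/fftw/lib/libfftw3.so /opt/fftw/lib/cmake/fftw3/FFTW3Config.cmake
    # Assumed: FFTW_DIR points at this prefix, so later configures can pass
    #   cmake -D CMAKE_PREFIX_PATH=${FFTW_DIR} ...
    export FFTW_DIR=/opt/fftw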
Step 21/25 : ARG HEFFTE_VERSION=2.3.0
 ---> Running in e69261ffadb9
 ---> Removed intermediate container e69261ffadb9
 ---> d7eff5989d95
Step 22/25 : ENV HEFFTE_DIR=/opt/heffte
 ---> Running in 48602ae8e1a3
 ---> Removed intermediate container 48602ae8e1a3
 ---> 1a80cfa715f7
Step 23/25 : RUN HEFFTE_URL=https://github.com/icl-utk-edu/heffte/archive/v${HEFFTE_VERSION}.tar.gz &&     HEFFTE_ARCHIVE=heffte.tar.gz &&     SCRATCH_DIR=/scratch && mkdir -p ${SCRATCH_DIR} && cd ${SCRATCH_DIR} &&     wget --quiet ${HEFFTE_URL} --output-document=${HEFFTE_ARCHIVE} &&     mkdir -p heffte &&     tar -xf ${HEFFTE_ARCHIVE} -C heffte --strip-components=1 &&     cd heffte &&     mkdir -p build && cd build &&     cmake       -D CMAKE_INSTALL_PREFIX=${HEFFTE_DIR}       -D CMAKE_PREFIX_PATH=${FFTW_DIR}       -D CMAKE_BUILD_TYPE=Debug       -D Heffte_ENABLE_CUDA=ON       -D Heffte_ENABLE_FFTW=ON     .. &&     make -j${NPROCS} install &&     rm -rf ${SCRATCH_DIR}
 ---> Running in 76731c057084
-- The CXX compiler identification is GNU 9.4.0
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Could NOT find Git (missing: GIT_EXECUTABLE) 
-- Found MPI_CXX: /opt/openmpi/lib/libmpi.so (found version "3.1") 
-- Found MPI: TRUE (found version "3.1")  
-- Looking for C++ include pthread.h
-- Looking for C++ include pthread.h - found
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
-- Looking for pthread_create in pthreads
-- Looking for pthread_create in pthreads - not found
-- Looking for pthread_create in pthread
-- Looking for pthread_create in pthread - found
-- Found Threads: TRUE  
-- Found CUDA: /usr/local/cuda (found version "11.0") 
-- Found OpenMP_CXX: -fopenmp (found version "4.5") 
-- Found OpenMP: TRUE (found version "4.5")  
-- Found HeffteFFTW: /opt/fftw/include  
-- 
-- heFFTe 2.3.0
--  -D CMAKE_INSTALL_PREFIX=/opt/heffte
--  -D BUILD_SHARED_LIBS=ON
--  -D CMAKE_BUILD_TYPE=Debug
--  -D CMAKE_CXX_FLAGS_DEBUG=-g
--  -D CMAKE_CXX_FLAGS=
--  -D MPI_CXX_COMPILER=/opt/openmpi/bin/mpicxx
--  -D MPI_CXX_COMPILE_OPTIONS=-pthread
--  -D CUDA_NVCC_FLAGS=-std=c++11
--  -D CUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda
--  -D Heffte_ENABLE_FFTW=ON
--  -D Heffte_ENABLE_MKL=OFF
--  -D Heffte_ENABLE_CUDA=ON
--  -D Heffte_ENABLE_ROCM=OFF
--  -D Heffte_ENABLE_ONEAPI=OFF
--  -D Heffte_ENABLE_AVX=OFF
--  -D Heffte_ENABLE_AVX512=OFF
--  -D Heffte_ENABLE_PYTHON=OFF
--  -D Heffte_ENABLE_FORTRAN=OFF
--  -D Heffte_ENABLE_TRACING=OFF
-- 
-- Setting Heffte INSTALL_RPATH = /usr/local/cuda/lib64;/usr/lib/x86_64-linux-gnu;/opt/openmpi/lib;/opt/fftw/lib;/usr/lib/gcc/x86_64-linux-gnu/9
-- Configuring done
-- Generating done
-- Build files have been written to: /scratch/heffte/build
[  1%] Building NVCC (Device) object CMakeFiles/Heffte.dir/src/Heffte_generated_heffte_backend_cuda.cu.o
Scanning dependencies of target Heffte
[  6%] Building CXX object CMakeFiles/Heffte.dir/src/heffte_c.cpp.o
[  6%] Building CXX object CMakeFiles/Heffte.dir/src/heffte_magma_helpers.cpp.o
[  8%] Building CXX object CMakeFiles/Heffte.dir/src/heffte_reshape3d.cpp.o
[  8%] Building CXX object CMakeFiles/Heffte.dir/src/heffte_plan_logic.cpp.o
[ 10%] Building CXX object CMakeFiles/Heffte.dir/src/heffte_compute_transform.cpp.o
[ 11%] Linking CXX shared library libheffte.so
[ 11%] Built target Heffte
Scanning dependencies of target convolution
Scanning dependencies of target speed3d_r2r
Scanning dependencies of target speed3d_c2c
Scanning dependencies of target speed3d_r2c
[ 15%] Building CXX object benchmarks/CMakeFiles/speed3d_c2c.dir/speed3d_c2c.cpp.o
[ 15%] Building CXX object benchmarks/CMakeFiles/convolution.dir/convolution.cpp.o
[ 16%] Building CXX object benchmarks/CMakeFiles/speed3d_r2r.dir/speed3d_r2r.cpp.o
[ 18%] Building CXX object benchmarks/CMakeFiles/speed3d_r2c.dir/speed3d_r2c.cpp.o
[ 20%] Linking CXX executable convolution
[ 20%] Built target convolution
Scanning dependencies of target heffte_example_r2c
[ 22%] Building CXX object examples/CMakeFiles/heffte_example_r2c.dir/heffte_example_r2c.cpp.o
[ 23%] Linking CXX executable speed3d_c2c
[ 23%] Built target speed3d_c2c
Scanning dependencies of target heffte_example_options
[ 25%] Building CXX object examples/CMakeFiles/heffte_example_options.dir/heffte_example_options.cpp.o
[ 27%] Linking CXX executable speed3d_r2c
[ 27%] Built target speed3d_r2c
Scanning dependencies of target heffte_example_vectors
[ 28%] Building CXX object examples/CMakeFiles/heffte_example_vectors.dir/heffte_example_vectors.cpp.o
[ 30%] Linking CXX executable heffte_example_r2c
[ 32%] Linking CXX executable speed3d_r2r
[ 32%] Built target heffte_example_r2c
Scanning dependencies of target heffte_example_gpu
[ 33%] Building CXX object examples/CMakeFiles/heffte_example_gpu.dir/heffte_example_gpu.cpp.o
[ 33%] Built target speed3d_r2r
Scanning dependencies of target heffte_example_r2r
[ 35%] Building CXX object examples/CMakeFiles/heffte_example_r2r.dir/heffte_example_r2r.cpp.o
[ 37%] Linking CXX executable heffte_example_options
[ 37%] Built target heffte_example_options
Scanning dependencies of target heffte_example_fftw
[ 38%] Building CXX object examples/CMakeFiles/heffte_example_fftw.dir/heffte_example_fftw.cpp.o
[ 40%] Linking CXX executable heffte_example_vectors
[ 40%] Built target heffte_example_vectors
Scanning dependencies of target test_fft3d_np6
[ 42%] Building CXX object test/CMakeFiles/test_fft3d_np6.dir/test_fft3d_np6.cpp.o
[ 44%] Linking CXX executable heffte_example_gpu
[ 44%] Built target heffte_example_gpu
Scanning dependencies of target test_fft3d_np2
[ 45%] Building CXX object test/CMakeFiles/test_fft3d_np2.dir/test_fft3d_np2.cpp.o
[ 47%] Linking CXX executable heffte_example_r2r
[ 47%] Built target heffte_example_r2r
Scanning dependencies of target test_unit_nompi
[ 49%] Building CXX object test/CMakeFiles/test_unit_nompi.dir/test_units_nompi.cpp.o
[ 50%] Linking CXX executable heffte_example_fftw
[ 50%] Built target heffte_example_fftw
Scanning dependencies of target test_fft3d_np4
[ 52%] Building CXX object test/CMakeFiles/test_fft3d_np4.dir/test_fft3d_np4.cpp.o
[ 54%] Linking CXX executable test_fft3d_np6
[ 54%] Built target test_fft3d_np6
Scanning dependencies of target test_heffte_header
[ 55%] Building CXX object test/CMakeFiles/test_heffte_header.dir/test_heffte_header.cpp.o
[ 57%] Linking CXX executable test_fft3d_np2
[ 57%] Built target test_fft3d_np2
Scanning dependencies of target test_fft3d_np8
[ 59%] Linking CXX executable test_heffte_header
[ 61%] Building CXX object test/CMakeFiles/test_fft3d_np8.dir/test_fft3d_np8.cpp.o
[ 62%] Linking CXX executable test_fft3d_np4
[ 62%] Built target test_heffte_header
Scanning dependencies of target test_fft3d_np1
[ 64%] Building CXX object test/CMakeFiles/test_fft3d_np1.dir/test_fft3d_np1.cpp.o
[ 66%] Linking CXX executable test_unit_nompi
[ 66%] Built target test_fft3d_np4
Scanning dependencies of target test_unit_stock
[ 67%] Building CXX object test/CMakeFiles/test_unit_stock.dir/test_units_stock.cpp.o
[ 67%] Built target test_unit_nompi
Scanning dependencies of target test_fft3d_np12
[ 69%] Building CXX object test/CMakeFiles/test_fft3d_np12.dir/test_fft3d_np12.cpp.o
[ 71%] Linking CXX executable test_unit_stock
[ 71%] Built target test_unit_stock
Scanning dependencies of target sandbox
[ 72%] Linking CXX executable test_fft3d_np8
[ 74%] Building CXX object test/CMakeFiles/sandbox.dir/sandbox.cpp.o
[ 74%] Built target test_fft3d_np8
Scanning dependencies of target test_streams
[ 76%] Building CXX object test/CMakeFiles/test_streams.dir/test_streams.cpp.o
[ 77%] Linking CXX executable test_fft3d_np1
[ 77%] Built target test_fft3d_np1
Scanning dependencies of target test_reshape3d
[ 79%] Building CXX object test/CMakeFiles/test_reshape3d.dir/test_reshape3d.cpp.o
[ 81%] Linking CXX executable test_fft3d_np12
[ 81%] Built target test_fft3d_np12
Scanning dependencies of target test_subcomm
[ 83%] Building CXX object test/CMakeFiles/test_subcomm.dir/test_subcomm.cpp.o
[ 84%] Linking CXX executable sandbox
[ 84%] Built target sandbox
Scanning dependencies of target test_fft3d_r2c
[ 86%] Building CXX object test/CMakeFiles/test_fft3d_r2c.dir/test_fft3d_r2c.cpp.o
[ 88%] Linking CXX executable test_streams
[ 88%] Built target test_streams
Scanning dependencies of target test_cos_transform
[ 89%] Building CXX object test/CMakeFiles/test_cos_transform.dir/test_cos.cpp.o
[ 91%] Linking CXX executable test_subcomm
[ 91%] Built target test_subcomm
Scanning dependencies of target test_longlong
[ 93%] Building CXX object test/CMakeFiles/test_longlong.dir/test_longlong.cpp.o
[ 94%] Linking CXX executable test_fft3d_r2c
[ 96%] Linking CXX executable test_reshape3d
[ 96%] Built target test_fft3d_r2c
[ 96%] Built target test_reshape3d
[ 98%] Linking CXX executable test_cos_transform
[ 98%] Built target test_cos_transform
[100%] Linking CXX executable test_longlong
[100%] Built target test_longlong
Install the project...
-- Install configuration: "Debug"
-- Installing: /opt/heffte/lib/libheffte.so.2.3.0
-- Installing: /opt/heffte/lib/libheffte.so.2
-- Set runtime path of "/opt/heffte/lib/libheffte.so.2.3.0" to "/usr/local/cuda/lib64:/usr/lib/x86_64-linux-gnu:/opt/openmpi/lib:/opt/fftw/lib:/usr/lib/gcc/x86_64-linux-gnu/9"
-- Installing: /opt/heffte/lib/libheffte.so
-- Installing: /opt/heffte/lib/cmake/Heffte/HeffteTargets.cmake
-- Installing: /opt/heffte/lib/cmake/Heffte/HeffteTargets-debug.cmake
-- Installing: /opt/heffte/lib/cmake/Heffte/HeffteConfig.cmake
-- Installing: /opt/heffte/lib/cmake/Heffte/HeffteConfigVersion.cmake
-- Installing: /opt/heffte/include
-- Installing: /opt/heffte/include/heffte_backend_cuda.h
-- Installing: /opt/heffte/include/heffte_compute_transform.h
-- Installing: /opt/heffte/include/heffte_magma_helpers.h
-- Installing: /opt/heffte/include/heffte_c_defines.h
-- Installing: /opt/heffte/include/heffte_backend_mkl.h
-- Installing: /opt/heffte/include/heffte_backend_rocm.h
-- Installing: /opt/heffte/include/heffte_backend_fftw.h
-- Installing: /opt/heffte/include/heffte_fft3d.h
-- Installing: /opt/heffte/include/heffte_geometry.h
-- Installing: /opt/heffte/include/heffte_backend_stock.h
-- Installing: /opt/heffte/include/heffte_backends.h
-- Installing: /opt/heffte/include/heffte_backend_oneapi.h
-- Installing: /opt/heffte/include/heffte_r2r_executor.h
-- Installing: /opt/heffte/include/heffte_pack3d.h
-- Installing: /opt/heffte/include/heffte_trace.h
-- Installing: /opt/heffte/include/heffte_plan_logic.h
-- Installing: /opt/heffte/include/heffte_fft3d_r2c.h
-- Installing: /opt/heffte/include/heffte_backend_data_transfer.h
-- Installing: /opt/heffte/include/heffte_common.h
-- Installing: /opt/heffte/include/heffte_backend_vector.h
-- Installing: /opt/heffte/include/heffte_utils.h
-- Installing: /opt/heffte/include/heffte_c.h
-- Installing: /opt/heffte/include/heffte_reshape3d.h
-- Installing: /opt/heffte/include/heffte_config.cmake.h
-- Installing: /opt/heffte/include/heffte.h
-- Installing: /opt/heffte/include/stock_fft
-- Installing: /opt/heffte/include/stock_fft/heffte_stock_allocator.h
-- Installing: /opt/heffte/include/stock_fft/heffte_stock_complex.h
-- Installing: /opt/heffte/include/stock_fft/heffte_stock_algos.h
-- Installing: /opt/heffte/include/stock_fft/heffte_stock_vec_types.h
-- Installing: /opt/heffte/include/stock_fft/heffte_stock_tree.h
-- Installing: /opt/heffte/include/heffte_config.h
-- Installing: /opt/heffte/share/heffte/testing/CMakeLists.txt
-- Installing: /opt/heffte/share/heffte/examples/CMakeLists.txt
-- Up-to-date: /opt/heffte/share/heffte/examples
-- Installing: /opt/heffte/share/heffte/examples/heffte_example_gpu.cpp
-- Installing: /opt/heffte/share/heffte/examples/heffte_example_vectors.cpp
-- Installing: /opt/heffte/share/heffte/examples/heffte_example_fftw.f90
-- Installing: /opt/heffte/share/heffte/examples/heffte_example_fftw.cpp
-- Installing: /opt/heffte/share/heffte/examples/heffte_example_r2r.cpp
-- Installing: /opt/heffte/share/heffte/examples/heffte_example_sycl.cpp
-- Installing: /opt/heffte/share/heffte/examples/heffte_example_c.c
-- Installing: /opt/heffte/share/heffte/examples/heffte_example_options.cpp
-- Installing: /opt/heffte/share/heffte/examples/heffte_example_r2c.cpp
 ---> Removed intermediate container 76731c057084
 ---> a2ee2d5df174
Step 24/25 : ENV HYPRE_DIR=/opt/hypre
 ---> Running in d03a6211705f
Selecting previously unselected package intel-oneapi-libdpstd-devel-2022.0.0.
Preparing to unpack .../28-intel-oneapi-libdpstd-devel-2022.0.0_2022.0.0-25335_amd64.deb ...
Unpacking intel-oneapi-libdpstd-devel-2022.0.0 (2022.0.0-25335) ...
Selecting previously unselected package intel-oneapi-compiler-dpcpp-cpp-2023.0.0.
Preparing to unpack .../29-intel-oneapi-compiler-dpcpp-cpp-2023.0.0_2023.0.0-25370_amd64.deb ...
Unpacking intel-oneapi-compiler-dpcpp-cpp-2023.0.0 (2023.0.0-25370) ...
Setting up intel-oneapi-common-licensing-2023.1.0 (2023.1.0-43473) ...
Setting up intel-oneapi-common-oneapi-vars-2024.0 (2024.0.0-49406) ...
Setting up intel-oneapi-common-licensing-2023.0.0 (2023.0.0-25325) ...
Setting up intel-oneapi-icc-eclipse-plugin-cpp-2023.0.0 (2023.0.0-25370) ...
Setting up intel-oneapi-common-licensing-2024.0 (2024.0.0-49406) ...
Setting up intel-oneapi-condaindex (2023.2.0-49417) ...
Setting up intel-oneapi-common-licensing-2023.2.0 (2023.2.0-49462) ...
Setting up intel-oneapi-common-vars (2024.0.0-49406) ...
Setting up intel-oneapi-dpcpp-debugger-eclipse-cfg (2023.1.0-43513) ...
Setting up intel-oneapi-compiler-shared-common-2023.0.0 (2023.0.0-25370) ...
Setting up intel-oneapi-compiler-cpp-eclipse-cfg-2024.0 (2024.0.2-49895) ...
Setting up intel-oneapi-openmp-common-2023.0.0 (2023.0.0-25370) ...
Setting up intel-oneapi-compiler-dpcpp-eclipse-cfg-2024.0 (2024.0.2-49895) ...
Setting up intel-oneapi-dev-utilities-eclipse-cfg (2021.10.0-49423) ...
Setting up intel-oneapi-libdpstd-devel-2022.0.0 (2022.0.0-25335) ...
 ---> Removed intermediate container d03a6211705f
 ---> 4b14b4bced47
Step 25/25 : RUN HYPRE_VERSION=v2.22.1 &&     HYPRE_URL=https://github.com/hypre-space/hypre/archive/${HYPRE_VERSION}.tar.gz &&     HYPRE_ARCHIVE=hypre.tar.gz &&     wget --quiet ${HYPRE_URL} --output-document=${HYPRE_ARCHIVE} &&     mkdir hypre &&     tar -xf ${HYPRE_ARCHIVE} -C hypre --strip-components=1 &&     cd hypre &&     mkdir -p build && cd build &&     cmake       -D CMAKE_INSTALL_PREFIX=${HYPRE_DIR}       -D CMAKE_BUILD_TYPE=Debug       -D HYPRE_WITH_CUDA=ON       -D HYPRE_WITH_MPI=ON     ../src &&     make -j${NPROCS} install &&     cd ../.. && rm -r hypre
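As with the heFFTe step, a readable sketch of this final RUN step; HYPRE_DIR and NPROCS are assumed to hold the values provided by the earlier ENV steps. hypre keeps its CMakeLists.txt under src/, hence the ../src source directory:

    # Fetch, configure, build and install hypre v2.22.1 with CUDA and MPI enabled.
    HYPRE_VERSION=v2.22.1
    HYPRE_URL=https://github.com/hypre-space/hypre/archive/${HYPRE_VERSION}.tar.gz
    HYPRE_ARCHIVE=hypre.tar.gz
    wget --quiet ${HYPRE_URL} --output-document=${HYPRE_ARCHIVE}
    mkdir hypre
    tar -xf ${HYPRE_ARCHIVE} -C hypre --strip-components=1
    cd hypre && mkdir -p build && cd build
    cmake \
      -D CMAKE_INSTALL_PREFIX=${HYPRE_DIR} \
      -D CMAKE_BUILD_TYPE=Debug \
      -D HYPRE_WITH_CUDA=ON \
      -D HYPRE_WITH_MPI=ON \
      ../src
    make -j${NPROCS} install
    cd ../.. && rm -r hypre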
 ---> Running in 3bea040edc42
-- The C compiler identification is GNU 9.4.0
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- The CXX compiler identification is GNU 9.4.0
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Enabled support for CXX.
-- Using CXX standard: c++11
-- Looking for a CUDA compiler
-- Looking for a CUDA compiler - /usr/local/cuda/bin/nvcc
-- The CUDA compiler identification is NVIDIA 11.0.221
-- Check for working CUDA compiler: /usr/local/cuda/bin/nvcc
-- Check for working CUDA compiler: /usr/local/cuda/bin/nvcc -- works
-- Detecting CUDA compiler ABI info
-- Detecting CUDA compiler ABI info - done
-- Enabled support for CUDA.
-- Using CUDA architecture: 70
-- Looking for pthread.h
-- Looking for pthread.h - found
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
-- Looking for pthread_create in pthreads
-- Looking for pthread_create in pthreads - not found
-- Looking for pthread_create in pthread
-- Looking for pthread_create in pthread - found
-- Found Threads: TRUE  
-- Found CUDA: /usr/local/cuda (found version "11.0") 
-- Found MPI_C: /opt/openmpi/lib/libmpi.so (found version "3.1") 
-- Found MPI_CXX: /opt/openmpi/lib/libmpi.so (found version "3.1") 
-- Found MPI: TRUE (found version "3.1")  
-- Configuring done
-- Generating done
-- Build files have been written to: /hypre/build
Scanning dependencies of target HYPRE
[  0%] Building C object CMakeFiles/HYPRE.dir/blas/dasum.c.o
[  0%] Building C object CMakeFiles/HYPRE.dir/blas/ddot.c.o
[  0%] Building C object CMakeFiles/HYPRE.dir/blas/dcopy.c.o
[  0%] Building C object CMakeFiles/HYPRE.dir/blas/daxpy.c.o
[  0%] Building C object CMakeFiles/HYPRE.dir/blas/dgemm.c.o
[  1%] Building C object CMakeFiles/HYPRE.dir/blas/dgemv.c.o
[  1%] Building C object CMakeFiles/HYPRE.dir/blas/dger.c.o
[  1%] Building C object CMakeFiles/HYPRE.dir/blas/dnrm2.c.o
[  1%] Building C object CMakeFiles/HYPRE.dir/blas/drot.c.o
[  1%] Building C object CMakeFiles/HYPRE.dir/blas/dscal.c.o
[  2%] Building C object CMakeFiles/HYPRE.dir/blas/dsymm.c.o
[  2%] Building C object CMakeFiles/HYPRE.dir/blas/dswap.c.o
[  2%] Building C object CMakeFiles/HYPRE.dir/blas/dsymv.c.o
[  2%] Building C object CMakeFiles/HYPRE.dir/blas/dsyr2.c.o
[  2%] Building C object CMakeFiles/HYPRE.dir/blas/dsyr2k.c.o
[  2%] Building C object CMakeFiles/HYPRE.dir/blas/dsyrk.c.o
[  2%] Building C object CMakeFiles/HYPRE.dir/blas/dtrmm.c.o
[  3%] Building C object CMakeFiles/HYPRE.dir/blas/dtrmv.c.o
[  3%] Building C object CMakeFiles/HYPRE.dir/blas/dtrsm.c.o
[  3%] Building C object CMakeFiles/HYPRE.dir/blas/dtrsv.c.o
[  3%] Building C object CMakeFiles/HYPRE.dir/blas/f2c.c.o
[  3%] Building C object CMakeFiles/HYPRE.dir/blas/idamax.c.o
[  3%] Building C object CMakeFiles/HYPRE.dir/blas/lsame.c.o
[  4%] Building C object CMakeFiles/HYPRE.dir/blas/xerbla.c.o
[  4%] Building C object CMakeFiles/HYPRE.dir/lapack/dbdsqr.c.o
[  4%] Building C object CMakeFiles/HYPRE.dir/lapack/dgebd2.c.o
[  4%] Building C object CMakeFiles/HYPRE.dir/lapack/dgebrd.c.o
[  4%] Building C object CMakeFiles/HYPRE.dir/lapack/dgelq2.c.o
[  5%] Building C object CMakeFiles/HYPRE.dir/lapack/dgels.c.o
[  5%] Building C object CMakeFiles/HYPRE.dir/lapack/dgelqf.c.o
[  5%] Building C object CMakeFiles/HYPRE.dir/lapack/dgeqr2.c.o
[  5%] Building C object CMakeFiles/HYPRE.dir/lapack/dgeqrf.c.o
[  5%] Building C object CMakeFiles/HYPRE.dir/lapack/dgesvd.c.o
[  5%] Building C object CMakeFiles/HYPRE.dir/lapack/dgetrf.c.o
[  6%] Building C object CMakeFiles/HYPRE.dir/lapack/dgetri.c.o
[  6%] Building C object CMakeFiles/HYPRE.dir/lapack/dgetrs.c.o
[  6%] Building C object CMakeFiles/HYPRE.dir/lapack/dgetf2.c.o
[  6%] Building C object CMakeFiles/HYPRE.dir/lapack/dlabad.c.o
[  6%] Building C object CMakeFiles/HYPRE.dir/lapack/dlabrd.c.o
[  6%] Building C object CMakeFiles/HYPRE.dir/lapack/dlacpy.c.o
[  7%] Building C object CMakeFiles/HYPRE.dir/lapack/dlae2.c.o
[  7%] Building C object CMakeFiles/HYPRE.dir/lapack/dlaev2.c.o
[  7%] Building C object CMakeFiles/HYPRE.dir/lapack/dlamch.c.o
[  7%] Building C object CMakeFiles/HYPRE.dir/lapack/dlange.c.o
[  7%] Building C object CMakeFiles/HYPRE.dir/lapack/dlanst.c.o
[  7%] Building C object CMakeFiles/HYPRE.dir/lapack/dlansy.c.o
[  8%] Building C object CMakeFiles/HYPRE.dir/lapack/dlapy2.c.o
[  8%] Building C object CMakeFiles/HYPRE.dir/lapack/dlarfb.c.o
[  8%] Building C object CMakeFiles/HYPRE.dir/lapack/dlarf.c.o
[  8%] Building C object CMakeFiles/HYPRE.dir/lapack/dlarft.c.o
[  8%] Building C object CMakeFiles/HYPRE.dir/lapack/dlarfg.c.o
[  8%] Building C object CMakeFiles/HYPRE.dir/lapack/dlartg.c.o
[  9%] Building C object CMakeFiles/HYPRE.dir/lapack/dlas2.c.o
[  9%] Building C object CMakeFiles/HYPRE.dir/lapack/dlascl.c.o
[  9%] Building C object CMakeFiles/HYPRE.dir/lapack/dlaset.c.o
[  9%] Building C object CMakeFiles/HYPRE.dir/lapack/dlasq2.c.o
[  9%] Building C object CMakeFiles/HYPRE.dir/lapack/dlasq1.c.o
[ 10%] Building C object CMakeFiles/HYPRE.dir/lapack/dlasq3.c.o
[ 10%] Building C object CMakeFiles/HYPRE.dir/lapack/dlasq4.c.o
[ 10%] Building C object CMakeFiles/HYPRE.dir/lapack/dlasq5.c.o
[ 10%] Building C object CMakeFiles/HYPRE.dir/lapack/dlasq6.c.o
[ 10%] Building C object CMakeFiles/HYPRE.dir/lapack/dlasr.c.o
[ 10%] Building C object CMakeFiles/HYPRE.dir/lapack/dlasrt.c.o
[ 11%] Building C object CMakeFiles/HYPRE.dir/lapack/dlassq.c.o
[ 11%] Building C object CMakeFiles/HYPRE.dir/lapack/dlaswp.c.o
[ 11%] Building C object CMakeFiles/HYPRE.dir/lapack/dlasv2.c.o
[ 11%] Building C object CMakeFiles/HYPRE.dir/lapack/dlatrd.c.o
[ 11%] Building C object CMakeFiles/HYPRE.dir/lapack/dorg2l.c.o
[ 11%] Building C object CMakeFiles/HYPRE.dir/lapack/dorg2r.c.o
[ 12%] Building C object CMakeFiles/HYPRE.dir/lapack/dorgbr.c.o
[ 12%] Building C object CMakeFiles/HYPRE.dir/lapack/dorgl2.c.o
[ 12%] Building C object CMakeFiles/HYPRE.dir/lapack/dorglq.c.o
[ 12%] Building C object CMakeFiles/HYPRE.dir/lapack/dorgql.c.o
[ 12%] Building C object CMakeFiles/HYPRE.dir/lapack/dorgqr.c.o
[ 12%] Building C object CMakeFiles/HYPRE.dir/lapack/dorgtr.c.o
[ 13%] Building C object CMakeFiles/HYPRE.dir/lapack/dorm2r.c.o
[ 13%] Building C object CMakeFiles/HYPRE.dir/lapack/dormbr.c.o
[ 13%] Building C object CMakeFiles/HYPRE.dir/lapack/dorml2.c.o
[ 13%] Building C object CMakeFiles/HYPRE.dir/lapack/dormlq.c.o
[ 13%] Building C object CMakeFiles/HYPRE.dir/lapack/dormqr.c.o
[ 14%] Building C object CMakeFiles/HYPRE.dir/lapack/dpotf2.c.o
[ 14%] Building C object CMakeFiles/HYPRE.dir/lapack/dpotrf.c.o
[ 14%] Building C object CMakeFiles/HYPRE.dir/lapack/dpotrs.c.o
[ 14%] Building C object CMakeFiles/HYPRE.dir/lapack/dsteqr.c.o
[ 14%] Building C object CMakeFiles/HYPRE.dir/lapack/dsterf.c.o
[ 14%] Building C object CMakeFiles/HYPRE.dir/lapack/dsyev.c.o
[ 15%] Building C object CMakeFiles/HYPRE.dir/lapack/dsygs2.c.o
[ 15%] Building C object CMakeFiles/HYPRE.dir/lapack/dsygst.c.o
[ 15%] Building C object CMakeFiles/HYPRE.dir/lapack/dsygv.c.o
[ 15%] Building C object CMakeFiles/HYPRE.dir/lapack/dsytd2.c.o
[ 15%] Building C object CMakeFiles/HYPRE.dir/lapack/dtrti2.c.o
[ 15%] Building C object CMakeFiles/HYPRE.dir/lapack/dsytrd.c.o
[ 16%] Building C object CMakeFiles/HYPRE.dir/lapack/dtrtri.c.o
[ 16%] Building C object CMakeFiles/HYPRE.dir/lapack/ieeeck.c.o
[ 16%] Building C object CMakeFiles/HYPRE.dir/lapack/ilaenv.c.o
[ 16%] Building C object CMakeFiles/HYPRE.dir/lapack/lsame.c.o
[ 16%] Building C object CMakeFiles/HYPRE.dir/lapack/xerbla.c.o
[ 16%] Building CUDA object CMakeFiles/HYPRE.dir/utilities/HYPRE_handle.c.o
[ 17%] Building C object CMakeFiles/HYPRE.dir/utilities/amg_linklist.c.o
[ 17%] Building C object CMakeFiles/HYPRE.dir/utilities/HYPRE_version.c.o
[ 17%] Building C object CMakeFiles/HYPRE.dir/utilities/binsearch.c.o
[ 17%] Building C object CMakeFiles/HYPRE.dir/utilities/exchange_data.c.o
[ 17%] Building C object CMakeFiles/HYPRE.dir/utilities/F90_HYPRE_error.c.o
[ 17%] Building C object CMakeFiles/HYPRE.dir/utilities/F90_HYPRE_general.c.o
[ 18%] Building C object CMakeFiles/HYPRE.dir/utilities/fortran_matrix.c.o
[ 18%] Building C object CMakeFiles/HYPRE.dir/utilities/ap.c.o
[ 18%] Building C object CMakeFiles/HYPRE.dir/utilities/log.c.o
[ 18%] Building C object CMakeFiles/HYPRE.dir/utilities/complex.c.o
[ 18%] Building CUDA object CMakeFiles/HYPRE.dir/utilities/cuda_utils.c.o
[ 19%] Building C object CMakeFiles/HYPRE.dir/utilities/error.c.o
[ 19%] Building CUDA object CMakeFiles/HYPRE.dir/utilities/general.c.o
[ 19%] Building CUDA object CMakeFiles/HYPRE.dir/utilities/handle.c.o
[ 19%] Building C object CMakeFiles/HYPRE.dir/utilities/hopscotch_hash.c.o
[ 19%] Building CUDA object CMakeFiles/HYPRE.dir/utilities/memory.c.o
[ 19%] Building C object CMakeFiles/HYPRE.dir/utilities/merge_sort.c.o
[ 20%] Building C object CMakeFiles/HYPRE.dir/utilities/mpi_comm_f2c.c.o
[ 20%] Building CUDA object CMakeFiles/HYPRE.dir/utilities/nvtx.c.o
[ 20%] Building CUDA object CMakeFiles/HYPRE.dir/utilities/omp_device.c.o
[ 20%] Building C object CMakeFiles/HYPRE.dir/utilities/prefix_sum.c.o
[ 20%] Building C object CMakeFiles/HYPRE.dir/utilities/printf.c.o
[ 20%] Building C object CMakeFiles/HYPRE.dir/utilities/qsort.c.o
[ 21%] Building C object CMakeFiles/HYPRE.dir/utilities/utilities.c.o
[ 21%] Building C object CMakeFiles/HYPRE.dir/utilities/mpistubs.c.o
[ 21%] Building C object CMakeFiles/HYPRE.dir/utilities/qsplit.c.o
[ 21%] Building C object CMakeFiles/HYPRE.dir/utilities/random.c.o
[ 21%] Building C object CMakeFiles/HYPRE.dir/utilities/threading.c.o
[ 21%] Building C object CMakeFiles/HYPRE.dir/utilities/timer.c.o
[ 22%] Building C object CMakeFiles/HYPRE.dir/utilities/timing.c.o
[ 22%] Building C object CMakeFiles/HYPRE.dir/multivector/multivector.c.o
[ 22%] Building C object CMakeFiles/HYPRE.dir/multivector/temp_multivector.c.o
[ 22%] Building C object CMakeFiles/HYPRE.dir/krylov/bicgstab.c.o
[ 22%] Building C object CMakeFiles/HYPRE.dir/krylov/cgnr.c.o
[ 23%] Building C object CMakeFiles/HYPRE.dir/krylov/cogmres.c.o
[ 23%] Building C object CMakeFiles/HYPRE.dir/krylov/gmres.c.o
[ 23%] Building C object CMakeFiles/HYPRE.dir/krylov/flexgmres.c.o
[ 23%] Building C object CMakeFiles/HYPRE.dir/krylov/lgmres.c.o
[ 23%] Building C object CMakeFiles/HYPRE.dir/krylov/HYPRE_bicgstab.c.o
[ 23%] Building C object CMakeFiles/HYPRE.dir/krylov/HYPRE_cgnr.c.o
[ 24%] Building C object CMakeFiles/HYPRE.dir/krylov/HYPRE_gmres.c.o
[ 24%] Building C object CMakeFiles/HYPRE.dir/krylov/HYPRE_cogmres.c.o
[ 24%] Building C object CMakeFiles/HYPRE.dir/krylov/HYPRE_lgmres.c.o
[ 24%] Building C object CMakeFiles/HYPRE.dir/krylov/HYPRE_flexgmres.c.o
[ 24%] Building C object CMakeFiles/HYPRE.dir/krylov/HYPRE_pcg.c.o
[ 24%] Building C object CMakeFiles/HYPRE.dir/krylov/pcg.c.o
[ 25%] Building C object CMakeFiles/HYPRE.dir/krylov/HYPRE_lobpcg.c.o
[ 25%] Building C object CMakeFiles/HYPRE.dir/krylov/lobpcg.c.o
[ 25%] Building C object CMakeFiles/HYPRE.dir/seq_mv/csr_matop.c.o
[ 25%] Building CUDA object CMakeFiles/HYPRE.dir/seq_mv/csr_matop_device.c.o
[ 25%] Building C object CMakeFiles/HYPRE.dir/seq_mv/csr_matrix.c.o
[ 25%] Building CUDA object CMakeFiles/HYPRE.dir/seq_mv/csr_matrix_cuda_utils.c.o
[ 26%] Building C object CMakeFiles/HYPRE.dir/seq_mv/csr_matvec.c.o
[ 26%] Building CUDA object CMakeFiles/HYPRE.dir/seq_mv/csr_matvec_device.c.o
[ 26%] Building CUDA object CMakeFiles/HYPRE.dir/seq_mv/csr_matvec_oomp.c.o
[ 26%] Building CUDA object CMakeFiles/HYPRE.dir/seq_mv/csr_spadd_device.c.o
[ 26%] Building CUDA object CMakeFiles/HYPRE.dir/seq_mv/csr_spgemm_device.c.o
[ 26%] Building CUDA object CMakeFiles/HYPRE.dir/seq_mv/csr_spgemm_device_attempt.c.o
[ 27%] Building CUDA object CMakeFiles/HYPRE.dir/seq_mv/csr_spgemm_device_confident.c.o
[ 27%] Building CUDA object CMakeFiles/HYPRE.dir/seq_mv/csr_spgemm_device_cusparse.c.o
[ 27%] Building CUDA object CMakeFiles/HYPRE.dir/seq_mv/csr_spgemm_device_rocsparse.c.o
[ 27%] Building CUDA object CMakeFiles/HYPRE.dir/seq_mv/csr_spgemm_device_rowbound.c.o
[ 27%] Building CUDA object CMakeFiles/HYPRE.dir/seq_mv/csr_spgemm_device_rowest.c.o
[ 28%] Building CUDA object CMakeFiles/HYPRE.dir/seq_mv/csr_spgemm_device_util.c.o
[ 28%] Building CUDA object CMakeFiles/HYPRE.dir/seq_mv/csr_spmv_device.c.o
[ 28%] Building CUDA object CMakeFiles/HYPRE.dir/seq_mv/csr_sptrans_device.c.o
[ 28%] Building C object CMakeFiles/HYPRE.dir/seq_mv/genpart.c.o
[ 28%] Building C object CMakeFiles/HYPRE.dir/seq_mv/HYPRE_csr_matrix.c.o
[ 28%] Building C object CMakeFiles/HYPRE.dir/seq_mv/HYPRE_mapped_matrix.c.o
[ 29%] Building C object CMakeFiles/HYPRE.dir/seq_mv/HYPRE_multiblock_matrix.c.o
[ 29%] Building C object CMakeFiles/HYPRE.dir/seq_mv/HYPRE_vector.c.o
[ 29%] Building C object CMakeFiles/HYPRE.dir/seq_mv/mapped_matrix.c.o
[ 29%] Building C object CMakeFiles/HYPRE.dir/seq_mv/multiblock_matrix.c.o
[ 29%] Building CUDA object CMakeFiles/HYPRE.dir/seq_mv/vector.c.o
[ 29%] Building C object CMakeFiles/HYPRE.dir/seq_mv/vector_batched.c.o
[ 30%] Building C object CMakeFiles/HYPRE.dir/parcsr_mv/communicationT.c.o
[ 30%] Building C object CMakeFiles/HYPRE.dir/parcsr_mv/F90_HYPRE_parcsr_matrix.c.o
[ 30%] Building C object CMakeFiles/HYPRE.dir/parcsr_mv/F90_HYPRE_parcsr_vector.c.o
[ 30%] Building C object CMakeFiles/HYPRE.dir/parcsr_mv/F90_parcsr_matrix.c.o
[ 30%] Building C object CMakeFiles/HYPRE.dir/parcsr_mv/F90_par_vector.c.o
[ 30%] Building C object CMakeFiles/HYPRE.dir/parcsr_mv/gen_fffc.c.o
[ 31%] Building C object CMakeFiles/HYPRE.dir/parcsr_mv/HYPRE_parcsr_matrix.c.o
[ 31%] Building C object CMakeFiles/HYPRE.dir/parcsr_mv/HYPRE_parcsr_vector.c.o
[ 31%] Building C object CMakeFiles/HYPRE.dir/parcsr_mv/new_commpkg.c.o
[ 31%] Building C object CMakeFiles/HYPRE.dir/parcsr_mv/numbers.c.o
[ 31%] Building C object CMakeFiles/HYPRE.dir/parcsr_mv/par_csr_aat.c.o
[ 32%] Building C object CMakeFiles/HYPRE.dir/parcsr_mv/par_csr_assumed_part.c.o
[ 32%] Building C object CMakeFiles/HYPRE.dir/parcsr_mv/par_csr_bool_matop.c.o
[ 32%] Building C object CMakeFiles/HYPRE.dir/parcsr_mv/par_csr_bool_matrix.c.o
[ 32%] Building C object CMakeFiles/HYPRE.dir/parcsr_mv/par_csr_communication.c.o
[ 32%] Building C object CMakeFiles/HYPRE.dir/parcsr_mv/par_csr_matop.c.o
[ 32%] Building C object CMakeFiles/HYPRE.dir/parcsr_mv/par_csr_matrix.c.o
[ 33%] Building C object CMakeFiles/HYPRE.dir/parcsr_mv/par_csr_matop_marked.c.o
[ 33%] Building CUDA object CMakeFiles/HYPRE.dir/parcsr_mv/par_csr_matvec.c.o
[ 33%] Building C object CMakeFiles/HYPRE.dir/parcsr_mv/par_vector.c.o
[ 33%] Building C object CMakeFiles/HYPRE.dir/parcsr_mv/par_vector_batched.c.o
[ 33%] Building C object CMakeFiles/HYPRE.dir/parcsr_mv/par_make_system.c.o
[ 33%] Building C object CMakeFiles/HYPRE.dir/parcsr_mv/par_csr_triplemat.c.o
[ 34%] Building CUDA object CMakeFiles/HYPRE.dir/parcsr_mv/par_csr_fffc_device.c.o
[ 34%] Building CUDA object CMakeFiles/HYPRE.dir/parcsr_mv/par_csr_matop_device.c.o
[ 34%] Building CUDA object CMakeFiles/HYPRE.dir/parcsr_mv/par_csr_triplemat_device.c.o
[ 34%] Building CUDA object CMakeFiles/HYPRE.dir/parcsr_mv/par_vector_device.c.o
[ 34%] Building C object CMakeFiles/HYPRE.dir/parcsr_block_mv/csr_block_matrix.c.o
[ 34%] Building C object CMakeFiles/HYPRE.dir/parcsr_block_mv/csr_block_matvec.c.o
[ 35%] Building C object CMakeFiles/HYPRE.dir/parcsr_block_mv/par_csr_block_matrix.c.o
[ 35%] Building C object CMakeFiles/HYPRE.dir/parcsr_block_mv/par_csr_block_matvec.c.o
[ 35%] Building C object CMakeFiles/HYPRE.dir/parcsr_block_mv/par_csr_block_comm.c.o
[ 35%] Building C object CMakeFiles/HYPRE.dir/parcsr_block_mv/par_csr_block_rap.c.o
[ 35%] Building C object CMakeFiles/HYPRE.dir/parcsr_block_mv/par_csr_block_rap_communication.c.o
[ 35%] Building C object CMakeFiles/HYPRE.dir/parcsr_block_mv/par_csr_block_interp.c.o
[ 36%] Building C object CMakeFiles/HYPRE.dir/parcsr_block_mv/par_csr_block_relax.c.o
[ 36%] Building C object CMakeFiles/HYPRE.dir/parcsr_block_mv/par_block_nodal_systems.c.o
[ 36%] Building C object CMakeFiles/HYPRE.dir/distributed_matrix/distributed_matrix.c.o
[ 36%] Building C object CMakeFiles/HYPRE.dir/distributed_matrix/distributed_matrix_ISIS.c.o
[ 36%] Building C object CMakeFiles/HYPRE.dir/distributed_matrix/distributed_matrix_parcsr.c.o
[ 37%] Building C object CMakeFiles/HYPRE.dir/distributed_matrix/distributed_matrix_PETSc.c.o
[ 37%] Building C object CMakeFiles/HYPRE.dir/distributed_matrix/HYPRE_distributed_matrix.c.o
[ 37%] Building C object CMakeFiles/HYPRE.dir/IJ_mv/aux_parcsr_matrix.c.o
[ 37%] Building C object CMakeFiles/HYPRE.dir/IJ_mv/aux_par_vector.c.o
[ 37%] Building C object CMakeFiles/HYPRE.dir/IJ_mv/F90_HYPRE_IJMatrix.c.o
[ 37%] Building C object CMakeFiles/HYPRE.dir/IJ_mv/F90_HYPRE_IJVector.c.o
[ 38%] Building C object CMakeFiles/HYPRE.dir/IJ_mv/F90_IJMatrix.c.o
[ 38%] Building C object CMakeFiles/HYPRE.dir/IJ_mv/HYPRE_IJMatrix.c.o
[ 38%] Building C object CMakeFiles/HYPRE.dir/IJ_mv/HYPRE_IJVector.c.o
[ 38%] Building C object CMakeFiles/HYPRE.dir/IJ_mv/IJ_assumed_part.c.o
[ 38%] Building C object CMakeFiles/HYPRE.dir/IJ_mv/IJMatrix.c.o
[ 38%] Building C object CMakeFiles/HYPRE.dir/IJ_mv/IJMatrix_parcsr.c.o
[ 39%] Building C object CMakeFiles/HYPRE.dir/IJ_mv/IJVector.c.o
[ 39%] Building C object CMakeFiles/HYPRE.dir/IJ_mv/IJVector_parcsr.c.o
[ 39%] Building CUDA object CMakeFiles/HYPRE.dir/IJ_mv/IJMatrix_parcsr_device.c.o
[ 39%] Building CUDA object CMakeFiles/HYPRE.dir/IJ_mv/IJVector_parcsr_device.c.o
[ 39%] Building C object CMakeFiles/HYPRE.dir/matrix_matrix/HYPRE_ConvertParCSRMatrixToDistributedMatrix.c.o
[ 39%] Building C object CMakeFiles/HYPRE.dir/matrix_matrix/HYPRE_ConvertPETScMatrixToDistributedMatrix.c.o
[ 40%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/blas_dh.c.o
[ 40%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/Euclid_apply.c.o
[ 40%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/Euclid_dh.c.o
[ 40%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/ExternalRows_dh.c.o
[ 40%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/Factor_dh.c.o
[ 41%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/getRow_dh.c.o
[ 41%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/globalObjects.c.o
[ 41%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/Hash_dh.c.o
[ 41%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/Hash_i_dh.c.o
[ 41%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/ilu_mpi_bj.c.o
[ 41%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/ilu_mpi_pilu.c.o
[ 42%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/ilu_seq.c.o
[ 42%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/io_dh.c.o
[ 42%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/krylov_dh.c.o
[ 42%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/Mat_dh.c.o
[ 42%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/mat_dh_private.c.o
[ 42%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/MatGenFD.c.o
[ 43%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/Mem_dh.c.o
[ 43%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/Numbering_dh.c.o
[ 43%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/Parser_dh.c.o
[ 43%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/shellSort_dh.c.o
[ 43%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/sig_dh.c.o
[ 43%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/SortedList_dh.c.o
[ 44%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/SortedSet_dh.c.o
[ 44%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/SubdomainGraph_dh.c.o
[ 44%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/TimeLog_dh.c.o
[ 44%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/Timer_dh.c.o
[ 44%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/Euclid/Vec_dh.c.o
[ 44%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/ParaSails/ConjGrad.c.o
[ 45%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/ParaSails/DiagScale.c.o
[ 45%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/ParaSails/FGmres.c.o
[ 45%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/ParaSails/Hash.c.o
[ 45%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/ParaSails/hypre_ParaSails.c.o
[ 45%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/ParaSails/LoadBal.c.o
[ 46%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/ParaSails/Matrix.c.o
[ 46%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/ParaSails/Mem.c.o
[ 46%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/ParaSails/Numbering.c.o
[ 46%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/ParaSails/OrderStat.c.o
Setting up intel-oneapi-tbb-common-2021.8.0 (2021.8.0-25334) ...
Setting up intel-oneapi-dpcpp-debugger-2023.0.0 (2023.0.0-25336) ...
Setting up intel-oneapi-compiler-cpp-eclipse-cfg (2024.0.2-49895) ...
Setting up intel-oneapi-openmp-2023.0.0 (2023.0.0-25370) ...
[ 46%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/ParaSails/ParaSails.c.o
[ 46%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/ParaSails/PrunedRows.c.o
find: '/opt/intel//oneapi/compiler/latest/lib/pkgconfig': No such file or directory
[ 47%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/ParaSails/RowPatt.c.o
[ 47%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/ParaSails/StoredRows.c.o
[ 47%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/pilut/comm.c.o
[ 47%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/pilut/debug.c.o
Setting up intel-oneapi-dev-utilities-2021.8.0 (2021.8.0-25328) ...
Setting up intel-oneapi-tbb-2021.8.0 (2021.8.0-25334) ...
[ 47%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/pilut/distributed_qsort.c.o
[ 47%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/pilut/distributed_qsort_si.c.o
[ 48%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/pilut/HYPRE_DistributedMatrixPilutSolver.c.o
[ 48%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/pilut/ilut.c.o
[ 48%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/pilut/parilut.c.o
[ 48%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/pilut/parutil.c.o
[ 48%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/pilut/pblas1.c.o
[ 48%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/pilut/serilut.c.o
[ 49%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/pilut/trifactor.c.o
[ 49%] Building C object CMakeFiles/HYPRE.dir/distributed_ls/pilut/util.c.o
[ 49%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/amg_hybrid.c.o
[ 49%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/aux_interp.c.o
[ 49%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/F90_hypre_laplace.c.o
[ 50%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/F90_HYPRE_parcsr_amg.c.o
[ 50%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/F90_HYPRE_parcsr_bicgstab.c.o
[ 50%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/F90_HYPRE_parcsr_block.c.o
[ 50%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/F90_HYPRE_parcsr_cgnr.c.o
[ 50%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/F90_HYPRE_parcsr_Euclid.c.o
[ 50%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/F90_HYPRE_parcsr_gmres.c.o
[ 51%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/F90_HYPRE_parcsr_cogmres.c.o
[ 51%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/F90_HYPRE_parcsr_flexgmres.c.o
[ 51%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/F90_HYPRE_parcsr_lgmres.c.o
[ 51%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/F90_HYPRE_parcsr_hybrid.c.o
[ 51%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/F90_HYPRE_parcsr_int.c.o
[ 51%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/F90_HYPRE_parcsr_ParaSails.c.o
[ 52%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/F90_HYPRE_parcsr_pcg.c.o
Setting up intel-oneapi-compiler-dpcpp-eclipse-cfg (2024.0.2-49895) ...
Setting up intel-oneapi-compiler-dpcpp-cpp-common-2023.0.0 (2023.0.0-25370) ...
Setting up intel-oneapi-tbb-common-devel-2021.8.0 (2021.8.0-25334) ...
Setting up intel-oneapi-compiler-shared-runtime-2023.0.0 (2023.0.0-25370) ...
[ 52%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/F90_HYPRE_parcsr_pilut.c.o
[ 52%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/F90_HYPRE_parcsr_schwarz.c.o
[ 52%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/F90_HYPRE_ams.c.o
[ 52%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/gen_redcs_mat.c.o
[ 52%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/HYPRE_parcsr_amg.c.o
[ 53%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/HYPRE_parcsr_amgdd.c.o
[ 53%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/HYPRE_parcsr_bicgstab.c.o
[ 53%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/HYPRE_parcsr_block.c.o
[ 53%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/HYPRE_parcsr_cgnr.c.o
[ 53%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/HYPRE_parcsr_Euclid.c.o
[ 53%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/HYPRE_parcsr_gmres.c.o
[ 54%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/HYPRE_parcsr_cogmres.c.o
[ 54%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/HYPRE_parcsr_flexgmres.c.o
[ 54%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/HYPRE_parcsr_lgmres.c.o
[ 54%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/HYPRE_parcsr_hybrid.c.o
[ 54%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/HYPRE_parcsr_int.c.o
[ 55%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/HYPRE_parcsr_ilu.c.o
[ 55%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/HYPRE_parcsr_mgr.c.o
[ 55%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/HYPRE_parcsr_ParaSails.c.o
[ 55%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/HYPRE_parcsr_pcg.c.o
[ 55%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/HYPRE_parcsr_pilut.c.o
[ 55%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/HYPRE_parcsr_schwarz.c.o
[ 56%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/HYPRE_ams.c.o
[ 56%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/HYPRE_ads.c.o
[ 56%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/HYPRE_ame.c.o
[ 56%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_2s_interp.c.o
[ 56%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_amg.c.o
[ 56%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_amgdd.c.o
[ 57%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_amgdd_solve.c.o
[ 57%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_amgdd_comp_grid.c.o
[ 57%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_amgdd_helpers.c.o
[ 57%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_amgdd_fac_cycle.c.o
[ 57%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_amgdd_setup.c.o
[ 57%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_amg_setup.c.o
[ 58%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_amg_solve.c.o
[ 58%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_amg_solveT.c.o
[ 58%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_cg_relax_wt.c.o
[ 58%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_coarsen.c.o
[ 58%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_cgc_coarsen.c.o
[ 58%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_cheby.c.o
[ 59%] Building CUDA object CMakeFiles/HYPRE.dir/parcsr_ls/par_cheby_device.c.o
[ 59%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_coarse_parms.c.o
[ 59%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_coordinates.c.o
[ 59%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_cr.c.o
Setting up intel-oneapi-compiler-dpcpp-cpp-runtime-2023.0.0 (2023.0.0-25370) ...
Setting up intel-oneapi-compiler-shared-2023.0.0 (2023.0.0-25370) ...
Setting up intel-oneapi-tbb-devel-2021.8.0 (2021.8.0-25334) ...
[ 59%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_cycle.c.o
[ 60%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_add_cycle.c.o
[ 60%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_difconv.c.o
[ 60%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_gauss_elim.c.o
[ 60%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_gsmg.c.o
[ 60%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_indepset.c.o
[ 60%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_interp.c.o
[ 61%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_jacobi_interp.c.o
[ 61%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_krylov_func.c.o
[ 61%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_mod_lr_interp.c.o
[ 61%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_multi_interp.c.o
[ 61%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_laplace_27pt.c.o
[ 61%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_laplace_9pt.c.o
[ 62%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_laplace.c.o
[ 62%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_lr_interp.c.o
[ 62%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_mgr.c.o
[ 62%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_mgr_setup.c.o
[ 62%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_mgr_solve.c.o
[ 62%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_nongalerkin.c.o
[ 63%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_nodal_systems.c.o
[ 63%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_rap.c.o
[ 63%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_rap_communication.c.o
[ 63%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_rotate_7pt.c.o
[ 63%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_vardifconv.c.o
[ 64%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_vardifconv_rs.c.o
[ 64%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_relax.c.o
[ 64%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_relax_more.c.o
[ 64%] Building CUDA object CMakeFiles/HYPRE.dir/parcsr_ls/par_relax_more_device.c.o
[ 64%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_relax_interface.c.o
Setting up intel-oneapi-dpcpp-cpp-2023.0.0 (2023.0.0-25370) ...
[ 64%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_scaled_matnorm.c.o
[ 65%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_schwarz.c.o
[ 65%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_stats.c.o
[ 65%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_strength.c.o
Setting up intel-oneapi-compiler-dpcpp-cpp-2023.0.0 (2023.0.0-25370) ...
[ 65%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_sv_interp.c.o
[ 65%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_sv_interp_ln.c.o
[ 65%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/partial.c.o
[ 66%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/schwarz.c.o
[ 66%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/block_tridiag.c.o
[ 66%] Building CUDA object CMakeFiles/HYPRE.dir/parcsr_ls/ams.c.o
[ 66%] Building CUDA object CMakeFiles/HYPRE.dir/parcsr_ls/ads.c.o
[ 66%] Building CUDA object CMakeFiles/HYPRE.dir/parcsr_ls/ame.c.o
[ 66%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_restr.c.o
[ 67%] Building C object CMakeFiles/HYPRE.dir/parcsr_ls/par_lr_restr.c.o
[ 67%] Building CUDA object CMakeFiles/HYPRE.dir/parcsr_ls/par_ilu.c.o
[ 67%] Building CUDA object CMakeFiles/HYPRE.dir/parcsr_ls/par_ilu_setup.c.o
[ 67%] Building CUDA object CMakeFiles/HYPRE.dir/parcsr_ls/par_ilu_solve.c.o
[ 67%] Building CUDA object CMakeFiles/HYPRE.dir/parcsr_ls/par_coarsen_device.c.o
[ 67%] Building CUDA object CMakeFiles/HYPRE.dir/parcsr_ls/par_indepset_device.c.o
[ 68%] Building CUDA object CMakeFiles/HYPRE.dir/parcsr_ls/par_interp_device.c.o
[ 68%] Building CUDA object CMakeFiles/HYPRE.dir/parcsr_ls/par_interp_trunc_device.c.o
[ 68%] Building CUDA object CMakeFiles/HYPRE.dir/parcsr_ls/par_lr_interp_device.c.o
[ 68%] Building CUDA object CMakeFiles/HYPRE.dir/parcsr_ls/par_lr_restr_device.c.o
[ 68%] Building CUDA object CMakeFiles/HYPRE.dir/parcsr_ls/par_strength_device.c.o
[ 69%] Building CUDA object CMakeFiles/HYPRE.dir/parcsr_ls/par_strength2nd_device.c.o
[ 69%] Building CUDA object CMakeFiles/HYPRE.dir/parcsr_ls/par_amgdd_fac_cycle_device.c.o
[ 69%] Building CUDA object CMakeFiles/HYPRE.dir/parcsr_ls/par_2s_interp_device.c.o
[ 69%] Building CUDA object CMakeFiles/HYPRE.dir/parcsr_ls/par_relax_device.c.o
[ 69%] Building C object CMakeFiles/HYPRE.dir/struct_mv/assumed_part.c.o
[ 69%] Building C object CMakeFiles/HYPRE.dir/struct_mv/box_algebra.c.o
[ 70%] Building C object CMakeFiles/HYPRE.dir/struct_mv/box_boundary.c.o
[ 70%] Building C object CMakeFiles/HYPRE.dir/struct_mv/box.c.o
[ 70%] Building C object CMakeFiles/HYPRE.dir/struct_mv/box_manager.c.o
[ 70%] Building C object CMakeFiles/HYPRE.dir/struct_mv/communication_info.c.o
[ 70%] Building C object CMakeFiles/HYPRE.dir/struct_mv/computation.c.o
[ 70%] Building C object CMakeFiles/HYPRE.dir/struct_mv/F90_HYPRE_struct_grid.c.o
[ 71%] Building C object CMakeFiles/HYPRE.dir/struct_mv/F90_HYPRE_struct_matrix.c.o
[ 71%] Building C object CMakeFiles/HYPRE.dir/struct_mv/F90_HYPRE_struct_stencil.c.o
[ 71%] Building C object CMakeFiles/HYPRE.dir/struct_mv/F90_HYPRE_struct_vector.c.o
[ 71%] Building C object CMakeFiles/HYPRE.dir/struct_mv/HYPRE_struct_grid.c.o
[ 71%] Building C object CMakeFiles/HYPRE.dir/struct_mv/HYPRE_struct_matrix.c.o
[ 71%] Building C object CMakeFiles/HYPRE.dir/struct_mv/HYPRE_struct_stencil.c.o
[ 72%] Building C object CMakeFiles/HYPRE.dir/struct_mv/HYPRE_struct_vector.c.o
[ 72%] Building C object CMakeFiles/HYPRE.dir/struct_mv/project.c.o
[ 72%] Building CUDA object CMakeFiles/HYPRE.dir/struct_mv/struct_axpy.c.o
[ 72%] Building CUDA object CMakeFiles/HYPRE.dir/struct_mv/struct_communication.c.o
[ 72%] Building CUDA object CMakeFiles/HYPRE.dir/struct_mv/struct_copy.c.o
[ 73%] Building C object CMakeFiles/HYPRE.dir/struct_mv/struct_grid.c.o
[ 73%] Building CUDA object CMakeFiles/HYPRE.dir/struct_mv/struct_innerprod.c.o
[ 73%] Building C object CMakeFiles/HYPRE.dir/struct_mv/struct_io.c.o
[ 73%] Building CUDA object CMakeFiles/HYPRE.dir/struct_mv/struct_matrix.c.o
[ 73%] Building C object CMakeFiles/HYPRE.dir/struct_mv/struct_matrix_mask.c.o
[ 73%] Building CUDA object CMakeFiles/HYPRE.dir/struct_mv/struct_matvec.c.o
[ 74%] Building CUDA object CMakeFiles/HYPRE.dir/struct_mv/struct_scale.c.o
[ 74%] Building C object CMakeFiles/HYPRE.dir/struct_mv/struct_stencil.c.o
[ 74%] Building CUDA object CMakeFiles/HYPRE.dir/struct_mv/struct_vector.c.o
[ 74%] Building C object CMakeFiles/HYPRE.dir/struct_ls/coarsen.c.o
[ 74%] Building CUDA object CMakeFiles/HYPRE.dir/struct_ls/cyclic_reduction.c.o
[ 74%] Building C object CMakeFiles/HYPRE.dir/struct_ls/F90_HYPRE_struct_bicgstab.c.o
[ 75%] Building C object CMakeFiles/HYPRE.dir/struct_ls/F90_HYPRE_struct_cycred.c.o
[ 75%] Building C object CMakeFiles/HYPRE.dir/struct_ls/F90_HYPRE_struct_gmres.c.o
[ 75%] Building C object CMakeFiles/HYPRE.dir/struct_ls/F90_HYPRE_struct_hybrid.c.o
[ 75%] Building C object CMakeFiles/HYPRE.dir/struct_ls/F90_HYPRE_struct_int.c.o
[ 75%] Building C object CMakeFiles/HYPRE.dir/struct_ls/F90_HYPRE_struct_jacobi.c.o
[ 75%] Building C object CMakeFiles/HYPRE.dir/struct_ls/F90_HYPRE_struct_pcg.c.o
[ 76%] Building C object CMakeFiles/HYPRE.dir/struct_ls/F90_HYPRE_struct_pfmg.c.o
[ 76%] Building C object CMakeFiles/HYPRE.dir/struct_ls/F90_HYPRE_struct_smg.c.o
[ 76%] Building C object CMakeFiles/HYPRE.dir/struct_ls/F90_HYPRE_struct_sparse_msg.c.o
[ 76%] Building C object CMakeFiles/HYPRE.dir/struct_ls/hybrid.c.o
[ 76%] Building C object CMakeFiles/HYPRE.dir/struct_ls/HYPRE_struct_bicgstab.c.o
[ 76%] Building C object CMakeFiles/HYPRE.dir/struct_ls/HYPRE_struct_cycred.c.o
[ 77%] Building C object CMakeFiles/HYPRE.dir/struct_ls/HYPRE_struct_hybrid.c.o
[ 77%] Building CUDA object CMakeFiles/HYPRE.dir/struct_ls/HYPRE_struct_int.c.o
[ 77%] Building C object CMakeFiles/HYPRE.dir/struct_ls/HYPRE_struct_jacobi.c.o
[ 77%] Building C object CMakeFiles/HYPRE.dir/struct_ls/HYPRE_struct_pfmg.c.o
[ 77%] Building C object CMakeFiles/HYPRE.dir/struct_ls/HYPRE_struct_smg.c.o
[ 78%] Building C object CMakeFiles/HYPRE.dir/struct_ls/HYPRE_struct_sparse_msg.c.o
[ 78%] Building CUDA object CMakeFiles/HYPRE.dir/struct_ls/HYPRE_struct_pcg.c.o
[ 78%] Building C object CMakeFiles/HYPRE.dir/struct_ls/HYPRE_struct_gmres.c.o
[ 78%] Building C object CMakeFiles/HYPRE.dir/struct_ls/HYPRE_struct_flexgmres.c.o
[ 78%] Building C object CMakeFiles/HYPRE.dir/struct_ls/HYPRE_struct_lgmres.c.o
[ 78%] Building C object CMakeFiles/HYPRE.dir/struct_ls/jacobi.c.o
[ 79%] Building C object CMakeFiles/HYPRE.dir/struct_ls/pcg_struct.c.o
[ 79%] Building CUDA object CMakeFiles/HYPRE.dir/struct_ls/pfmg2_setup_rap.c.o
[ 79%] Building CUDA object CMakeFiles/HYPRE.dir/struct_ls/pfmg3_setup_rap.c.o
[ 79%] Building C object CMakeFiles/HYPRE.dir/struct_ls/pfmg.c.o
[ 79%] Building C object CMakeFiles/HYPRE.dir/struct_ls/pfmg_relax.c.o
[ 79%] Building CUDA object CMakeFiles/HYPRE.dir/struct_ls/pfmg_setup.c.o
[ 80%] Building CUDA object CMakeFiles/HYPRE.dir/struct_ls/pfmg_setup_interp.c.o
[ 80%] Building CUDA object CMakeFiles/HYPRE.dir/struct_ls/pfmg_setup_rap5.c.o
[ 80%] Building CUDA object CMakeFiles/HYPRE.dir/struct_ls/pfmg_setup_rap7.c.o
[ 80%] Building C object CMakeFiles/HYPRE.dir/struct_ls/pfmg_setup_rap.c.o
[ 80%] Building C object CMakeFiles/HYPRE.dir/struct_ls/pfmg_solve.c.o
[ 80%] Building CUDA object CMakeFiles/HYPRE.dir/struct_ls/point_relax.c.o
[ 81%] Building CUDA object CMakeFiles/HYPRE.dir/struct_ls/red_black_constantcoef_gs.c.o
[ 81%] Building CUDA object CMakeFiles/HYPRE.dir/struct_ls/red_black_gs.c.o
[ 81%] Building C object CMakeFiles/HYPRE.dir/struct_ls/semi.c.o
[ 81%] Building CUDA object CMakeFiles/HYPRE.dir/struct_ls/semi_interp.c.o
[ 81%] Building CUDA object CMakeFiles/HYPRE.dir/struct_ls/semi_restrict.c.o
[ 82%] Building CUDA object CMakeFiles/HYPRE.dir/struct_ls/semi_setup_rap.c.o
[ 82%] Building CUDA object CMakeFiles/HYPRE.dir/struct_ls/smg2_setup_rap.c.o
[ 82%] Building CUDA object CMakeFiles/HYPRE.dir/struct_ls/smg3_setup_rap.c.o
[ 82%] Building CUDA object CMakeFiles/HYPRE.dir/struct_ls/smg_axpy.c.o
[ 82%] Building CUDA object CMakeFiles/HYPRE.dir/struct_ls/smg.c.o
[ 82%] Building C object CMakeFiles/HYPRE.dir/struct_ls/smg_relax.c.o
[ 83%] Building CUDA object CMakeFiles/HYPRE.dir/struct_ls/smg_residual.c.o
[ 83%] Building C object CMakeFiles/HYPRE.dir/struct_ls/smg_setup.c.o
[ 83%] Building CUDA object CMakeFiles/HYPRE.dir/struct_ls/smg_setup_interp.c.o
[ 83%] Building C object CMakeFiles/HYPRE.dir/struct_ls/smg_setup_rap.c.o
[ 83%] Building C object CMakeFiles/HYPRE.dir/struct_ls/smg_setup_restrict.c.o
[ 83%] Building C object CMakeFiles/HYPRE.dir/struct_ls/smg_solve.c.o
[ 84%] Building CUDA object CMakeFiles/HYPRE.dir/struct_ls/sparse_msg2_setup_rap.c.o
[ 84%] Building CUDA object CMakeFiles/HYPRE.dir/struct_ls/sparse_msg3_setup_rap.c.o
[ 84%] Building C object CMakeFiles/HYPRE.dir/struct_ls/sparse_msg.c.o
[ 84%] Building CUDA object CMakeFiles/HYPRE.dir/struct_ls/sparse_msg_filter.c.o
[ 84%] Building CUDA object CMakeFiles/HYPRE.dir/struct_ls/sparse_msg_interp.c.o
[ 84%] Building CUDA object CMakeFiles/HYPRE.dir/struct_ls/sparse_msg_restrict.c.o
[ 85%] Building C object CMakeFiles/HYPRE.dir/struct_ls/sparse_msg_setup.c.o
[ 85%] Building C object CMakeFiles/HYPRE.dir/struct_ls/sparse_msg_setup_rap.c.o
[ 85%] Building C object CMakeFiles/HYPRE.dir/struct_ls/sparse_msg_solve.c.o
[ 85%] Building C object CMakeFiles/HYPRE.dir/sstruct_mv/F90_HYPRE_sstruct_graph.c.o
[ 85%] Building C object CMakeFiles/HYPRE.dir/sstruct_mv/F90_HYPRE_sstruct_grid.c.o
[ 85%] Building C object CMakeFiles/HYPRE.dir/sstruct_mv/F90_HYPRE_sstruct_matrix.c.o
[ 86%] Building C object CMakeFiles/HYPRE.dir/sstruct_mv/F90_HYPRE_sstruct_stencil.c.o
[ 86%] Building C object CMakeFiles/HYPRE.dir/sstruct_mv/F90_HYPRE_sstruct_vector.c.o
[ 86%] Building C object CMakeFiles/HYPRE.dir/sstruct_mv/HYPRE_sstruct_graph.c.o
[ 86%] Building C object CMakeFiles/HYPRE.dir/sstruct_mv/HYPRE_sstruct_grid.c.o
[ 86%] Building C object CMakeFiles/HYPRE.dir/sstruct_mv/HYPRE_sstruct_matrix.c.o
[ 87%] Building C object CMakeFiles/HYPRE.dir/sstruct_mv/HYPRE_sstruct_stencil.c.o
[ 87%] Building C object CMakeFiles/HYPRE.dir/sstruct_mv/HYPRE_sstruct_vector.c.o
[ 87%] Building C object CMakeFiles/HYPRE.dir/sstruct_mv/sstruct_axpy.c.o
[ 87%] Building C object CMakeFiles/HYPRE.dir/sstruct_mv/sstruct_copy.c.o
[ 87%] Building C object CMakeFiles/HYPRE.dir/sstruct_mv/sstruct_graph.c.o
[ 87%] Building C object CMakeFiles/HYPRE.dir/sstruct_mv/sstruct_grid.c.o
[ 88%] Building C object CMakeFiles/HYPRE.dir/sstruct_mv/sstruct_innerprod.c.o
[ 88%] Building CUDA object CMakeFiles/HYPRE.dir/sstruct_mv/sstruct_matrix.c.o
[ 88%] Building C object CMakeFiles/HYPRE.dir/sstruct_mv/sstruct_matvec.c.o
[ 88%] Building C object CMakeFiles/HYPRE.dir/sstruct_mv/sstruct_scale.c.o
[ 88%] Building C object CMakeFiles/HYPRE.dir/sstruct_mv/sstruct_stencil.c.o
[ 88%] Building CUDA object CMakeFiles/HYPRE.dir/sstruct_mv/sstruct_vector.c.o
[ 89%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/F90_HYPRE_sstruct_bicgstab.c.o
[ 89%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/F90_HYPRE_sstruct_gmres.c.o
[ 89%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/F90_HYPRE_sstruct_flexgmres.c.o
[ 89%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/F90_HYPRE_sstruct_lgmres.c.o
[ 89%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/F90_HYPRE_sstruct_InterFAC.c.o
[ 89%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/F90_HYPRE_sstruct_int.c.o
[ 90%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/F90_HYPRE_sstruct_maxwell.c.o
[ 90%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/F90_HYPRE_sstruct_pcg.c.o
[ 90%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/F90_HYPRE_sstruct_split.c.o
[ 90%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/F90_HYPRE_sstruct_sys_pfmg.c.o
[ 90%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/HYPRE_sstruct_bicgstab.c.o
[ 91%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/HYPRE_sstruct_gmres.c.o
[ 91%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/HYPRE_sstruct_flexgmres.c.o
[ 91%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/HYPRE_sstruct_lgmres.c.o
[ 91%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/HYPRE_sstruct_InterFAC.c.o
[ 91%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/HYPRE_sstruct_int.c.o
[ 91%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/HYPRE_sstruct_maxwell.c.o
[ 92%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/HYPRE_sstruct_pcg.c.o
[ 92%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/HYPRE_sstruct_split.c.o
[ 92%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/HYPRE_sstruct_sys_pfmg.c.o
[ 92%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/bsearch.c.o
[ 92%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/fac.c.o
[ 92%] Building CUDA object CMakeFiles/HYPRE.dir/sstruct_ls/fac_amr_rap.c.o
[ 93%] Building CUDA object CMakeFiles/HYPRE.dir/sstruct_ls/fac_amr_fcoarsen.c.o
[ 93%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/fac_amr_zero_data.c.o
[ 93%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/fac_cf_coarsen.c.o
[ 93%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/fac_cfstencil_box.c.o
[ 93%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/fac_CFInterfaceExtents.c.o
[ 93%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/fac_interp2.c.o
[ 94%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/fac_relax.c.o
[ 94%] Building CUDA object CMakeFiles/HYPRE.dir/sstruct_ls/fac_restrict2.c.o
[ 94%] Building CUDA object CMakeFiles/HYPRE.dir/sstruct_ls/fac_setup2.c.o
[ 94%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/fac_solve3.c.o
[ 94%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/fac_zero_cdata.c.o
[ 94%] Building CUDA object CMakeFiles/HYPRE.dir/sstruct_ls/fac_zero_stencilcoef.c.o
[ 95%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/krylov.c.o
[ 95%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/krylov_sstruct.c.o
[ 95%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/eliminate_rowscols.c.o
[ 95%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/maxwell_grad.c.o
[ 95%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/maxwell_physbdy.c.o
[ 96%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/maxwell_PNedelec.c.o
[ 96%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/maxwell_PNedelec_bdy.c.o
[ 96%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/maxwell_semi_interp.c.o
[ 96%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/maxwell_solve.c.o
[ 96%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/maxwell_solve2.c.o
[ 96%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/maxwell_TV.c.o
[ 97%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/maxwell_TV_setup.c.o
[ 97%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/maxwell_zeroBC.c.o
[ 97%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/nd1_amge_interpolation.c.o
[ 97%] Building CUDA object CMakeFiles/HYPRE.dir/sstruct_ls/node_relax.c.o
[ 97%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/sstruct_amr_intercommunication.c.o
[ 97%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/sstruct_owninfo.c.o
[ 98%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/sstruct_recvinfo.c.o
[ 98%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/sstruct_sendinfo.c.o
[ 98%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/sstruct_sharedDOFComm.c.o
[ 98%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/sys_pfmg.c.o
[ 98%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/sys_pfmg_relax.c.o
[ 98%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/sys_pfmg_setup.c.o
[ 99%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/sys_pfmg_setup_interp.c.o
[ 99%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/sys_pfmg_setup_rap.c.o
[ 99%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/sys_pfmg_solve.c.o
[ 99%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/sys_semi_interp.c.o
[ 99%] Building C object CMakeFiles/HYPRE.dir/sstruct_ls/sys_semi_restrict.c.o
[100%] Linking CUDA static library libHYPRE.a
[100%] Built target HYPRE
Install the project...
-- Install configuration: "Debug"
-- Installing: /opt/hypre/lib/libHYPRE.a
-- Installing: /opt/hypre/include/HYPRE_config.h
-- Installing: /opt/hypre/include/HYPREf.h
-- Installing: /opt/hypre/include/HYPRE.h
-- Installing: /opt/hypre/include/_hypre_blas.h
-- Installing: /opt/hypre/include/f2c.h
-- Installing: /opt/hypre/include/hypre_blas.h
-- Installing: /opt/hypre/include/_hypre_lapack.h
-- Installing: /opt/hypre/include/HYPRE_utilities.h
-- Installing: /opt/hypre/include/_hypre_utilities.h
-- Installing: /opt/hypre/include/HYPRE_error_f.h
-- Installing: /opt/hypre/include/fortran.h
-- Installing: /opt/hypre/include/fortran_matrix.h
-- Installing: /opt/hypre/include/csr_matmultivec.h
-- Installing: /opt/hypre/include/interpreter.h
-- Installing: /opt/hypre/include/multivector.h
-- Installing: /opt/hypre/include/par_csr_matmultivec.h
-- Installing: /opt/hypre/include/par_csr_pmvcomm.h
-- Installing: /opt/hypre/include/par_multivector.h
-- Installing: /opt/hypre/include/seq_multivector.h
-- Installing: /opt/hypre/include/temp_multivector.h
-- Installing: /opt/hypre/include/HYPRE_krylov.h
-- Installing: /opt/hypre/include/HYPRE_lobpcg.h
-- Installing: /opt/hypre/include/HYPRE_MatvecFunctions.h
-- Installing: /opt/hypre/include/krylov.h
-- Installing: /opt/hypre/include/lobpcg.h
-- Installing: /opt/hypre/include/HYPRE_seq_mv.h
-- Installing: /opt/hypre/include/seq_mv.h
-- Installing: /opt/hypre/include/HYPRE_parcsr_mv.h
-- Installing: /opt/hypre/include/_hypre_parcsr_mv.h
-- Installing: /opt/hypre/include/par_csr_block_matrix.h
-- Installing: /opt/hypre/include/csr_block_matrix.h
-- Installing: /opt/hypre/include/distributed_matrix.h
-- Installing: /opt/hypre/include/HYPRE_distributed_matrix_mv.h
-- Installing: /opt/hypre/include/HYPRE_distributed_matrix_protos.h
-- Installing: /opt/hypre/include/HYPRE_distributed_matrix_types.h
-- Installing: /opt/hypre/include/aux_parcsr_matrix.h
-- Installing: /opt/hypre/include/aux_par_vector.h
-- Installing: /opt/hypre/include/HYPRE_IJ_mv.h
-- Installing: /opt/hypre/include/_hypre_IJ_mv.h
-- Installing: /opt/hypre/include/IJ_matrix.h
-- Installing: /opt/hypre/include/IJ_vector.h
-- Installing: /opt/hypre/include/HYPRE_matrix_matrix_protos.h
-- Installing: /opt/hypre/include/HYPRE_DistributedMatrixPilutSolver_protos.h
-- Installing: /opt/hypre/include/HYPRE_DistributedMatrixPilutSolver_types.h
-- Installing: /opt/hypre/include/HYPRE_parcsr_ls.h
-- Installing: /opt/hypre/include/_hypre_parcsr_ls.h
-- Installing: /opt/hypre/include/HYPRE_struct_mv.h
-- Installing: /opt/hypre/include/_hypre_struct_mv.h
-- Installing: /opt/hypre/include/HYPRE_struct_ls.h
-- Installing: /opt/hypre/include/_hypre_struct_ls.h
-- Installing: /opt/hypre/include/HYPRE_sstruct_mv.h
-- Installing: /opt/hypre/include/_hypre_sstruct_mv.h
-- Installing: /opt/hypre/include/HYPRE_sstruct_ls.h
-- Installing: /opt/hypre/include/_hypre_sstruct_ls.h
-- Installing: /opt/hypre/lib/cmake/HYPRE/HYPRETargets.cmake
-- Installing: /opt/hypre/lib/cmake/HYPRE/HYPRETargets-debug.cmake
-- Installing: /opt/hypre/lib/cmake/HYPRE/HYPREConfig.cmake
-- Installing: /opt/hypre/lib/cmake/HYPRE/HYPREConfigVersion.cmake
 ---> Removed intermediate container 3bea040edc42
 ---> 729fe8e9764b
Successfully built 729fe8e9764b
Successfully tagged 578e5e7aaf741e8041f57a4fbc9d1613efeacbf3:latest
[Pipeline] isUnix
[Pipeline] withEnv
[Pipeline] {
[Pipeline] sh
+ docker inspect -f . 578e5e7aaf741e8041f57a4fbc9d1613efeacbf3
.
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] withDockerContainer
fetnat06 seems to be running inside container d95ba1cc990ac8a12fa18becb48576632d11b2e32b0863d77c8b8aa8006052f4
$ docker run -t -d -u 0:0 -v /tmp/ccache.kokkos:/tmp/ccache --env NVIDIA_VISIBLE_DEVICES=$NVIDIA_VISIBLE_DEVICES -w /var/jenkins/workspace/Cabana_PR-743 --volumes-from d95ba1cc990ac8a12fa18becb48576632d11b2e32b0863d77c8b8aa8006052f4 -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** 578e5e7aaf741e8041f57a4fbc9d1613efeacbf3 cat
$ docker top 7f3e2cf1b4fdb51ff8bc8f4361b02a7b6af716ea9f03ba03564695daaaa6242c -eo pid,comm
[Pipeline] {
[Pipeline] sh
+ ccache --zero-stats
Statistics zeroed
[Pipeline] sh
+ rm -rf build
+ mkdir -p build
+ cd build
+ cmake -D CMAKE_BUILD_TYPE=Debug -D CMAKE_CXX_COMPILER=/opt/kokkos/bin/nvcc_wrapper -D CMAKE_CXX_COMPILER_LAUNCHER=ccache -D CMAKE_CXX_FLAGS=-Wall -Wextra -Wpedantic -Werror -D CMAKE_PREFIX_PATH=/opt/kokkos;/opt/arborx;/opt/heffte;/opt/hypre -D MPIEXEC_MAX_NUMPROCS=1 -D MPIEXEC_PREFLAGS=--allow-run-as-root;--mca;btl_smcuda_use_cuda_ipc;0 -D Cabana_REQUIRE_MPI=ON -D Cabana_REQUIRE_ARBORX=ON -D Cabana_REQUIRE_HEFFTE=ON -D Cabana_REQUIRE_CUDA=ON -D Cabana_ENABLE_TESTING=ON -D Cabana_ENABLE_PERFORMANCE_TESTING=ON -D Cabana_ENABLE_EXAMPLES=ON ..
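The line above is the shell trace of the CUDA configure step, so quoting of the multi-value arguments is not visible. A minimal standalone sketch of the same step follows (the /opt/kokkos, /opt/arborx, /opt/heffte and /opt/hypre prefixes and the nvcc_wrapper path are specific to this CI image; the quoting is reconstructed, not taken from the trace):

    # Recreate the out-of-source build directory used by this job
    rm -rf build && mkdir -p build && cd build
    # Configure Cabana in Debug mode against the CUDA toolchain baked into the image
    cmake \
      -D CMAKE_BUILD_TYPE=Debug \
      -D CMAKE_CXX_COMPILER=/opt/kokkos/bin/nvcc_wrapper \
      -D CMAKE_CXX_COMPILER_LAUNCHER=ccache \
      -D CMAKE_CXX_FLAGS="-Wall -Wextra -Wpedantic -Werror" \
      -D CMAKE_PREFIX_PATH="/opt/kokkos;/opt/arborx;/opt/heffte;/opt/hypre" \
      -D MPIEXEC_MAX_NUMPROCS=1 \
      -D MPIEXEC_PREFLAGS="--allow-run-as-root;--mca;btl_smcuda_use_cuda_ipc;0" \
      -D Cabana_REQUIRE_MPI=ON \
      -D Cabana_REQUIRE_ARBORX=ON \
      -D Cabana_REQUIRE_HEFFTE=ON \
      -D Cabana_REQUIRE_CUDA=ON \
      -D Cabana_ENABLE_TESTING=ON \
      -D Cabana_ENABLE_PERFORMANCE_TESTING=ON \
      -D Cabana_ENABLE_EXAMPLES=ON \
      ..
    # Build library, unit tests, performance tests and examples (as done below)
    make -j8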
-- The CXX compiler identification is GNU 9.4.0
-- Check for working CXX compiler: /opt/kokkos/bin/nvcc_wrapper
-- Check for working CXX compiler: /opt/kokkos/bin/nvcc_wrapper -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Enabled Kokkos devices: CUDA;SERIAL
CMake Warning at /opt/kokkos/lib/cmake/Kokkos/KokkosConfigCommon.cmake:58 (MESSAGE):
  The installed Kokkos configuration does not support CXX extensions.
  Forcing -DCMAKE_CXX_EXTENSIONS=Off
Call Stack (most recent call first):
  /opt/kokkos/lib/cmake/Kokkos/KokkosConfig.cmake:56 (INCLUDE)
  CMakeLists.txt:39 (find_package)


-- Found Kokkos_DEVICES: CUDA  
-- Found Kokkos_OPTIONS: CUDA_LAMBDA  
-- Found MPI_CXX: /opt/openmpi/lib/libmpi.so (found version "3.1") 
-- Found MPI: TRUE (found version "3.1")  
-- Could NOT find CLANG_FORMAT: Found unsuitable version "0.0", but required is at least "14" (found CLANG_FORMAT_EXECUTABLE-NOTFOUND)
-- The C compiler identification is GNU 9.4.0
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Looking for pthread.h
-- Looking for pthread.h - found
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
-- Looking for pthread_create in pthreads
-- Looking for pthread_create in pthreads - not found
-- Looking for pthread_create in pthread
-- Looking for pthread_create in pthread - found
-- Found Threads: TRUE  
-- Found CUDA: /usr/local/cuda (found version "11.0") 
-- Could NOT find Git (missing: GIT_EXECUTABLE) 
-- Cabana Revision = 'Not a git repository'
-- Found GTest: /usr/lib/x86_64-linux-gnu/libgtest.a (Required is at least version "1.10") 
-- Could NOT find VALGRIND (missing: VALGRIND_EXECUTABLE) 
-- Could NOT find Doxygen (missing: DOXYGEN_EXECUTABLE) 
-- Could NOT find VALGRIND (missing: VALGRIND_EXECUTABLE) 
-- Could NOT find VALGRIND (missing: VALGRIND_EXECUTABLE) 
-- Configuring done
-- Generating done
-- Build files have been written to: /var/jenkins/workspace/Cabana_PR-743/build
+ make -j8
Scanning dependencies of target Cabana_Slice_test_CUDA
Scanning dependencies of target Cabana_CommunicationPlan_MPI_test_CUDA
Scanning dependencies of target Cabana_Parallel_test_CUDA
Scanning dependencies of target Cabana_ParameterPack_test_CUDA
Scanning dependencies of target Cabana_NeighborList_test_CUDA
Scanning dependencies of target Cabana_Halo_MPI_test_CUDA_UVM
Scanning dependencies of target Cabana_NeighborList_test_SERIAL
[  0%] Building CXX object core/unit_test/CMakeFiles/Cabana_CommunicationPlan_MPI_test_CUDA.dir/CUDA/tstCommunicationPlan_CUDA.cpp.o
Scanning dependencies of target Cabana_Distributor_MPI_test_CUDA_UVM
[  0%] Building CXX object core/unit_test/CMakeFiles/Cabana_Slice_test_CUDA.dir/CUDA/tstSlice_CUDA.cpp.o
[  1%] Building CXX object core/unit_test/CMakeFiles/Cabana_Parallel_test_CUDA.dir/CUDA/tstParallel_CUDA.cpp.o
[  1%] Building CXX object core/unit_test/CMakeFiles/Cabana_ParameterPack_test_CUDA.dir/CUDA/tstParameterPack_CUDA.cpp.o
[  1%] Building CXX object core/unit_test/CMakeFiles/Cabana_Halo_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstHalo_CUDA_UVM.cpp.o
[  1%] Building CXX object core/unit_test/CMakeFiles/Cabana_NeighborList_test_CUDA.dir/CUDA/tstNeighborList_CUDA.cpp.o
[  1%] Building CXX object core/unit_test/CMakeFiles/Cabana_Distributor_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstDistributor_CUDA_UVM.cpp.o
[  1%] Building CXX object core/unit_test/CMakeFiles/Cabana_NeighborList_test_SERIAL.dir/SERIAL/tstNeighborList_SERIAL.cpp.o
[  1%] Building CXX object core/unit_test/CMakeFiles/Cabana_ParameterPack_test_CUDA.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[  2%] Building CXX object core/unit_test/CMakeFiles/Cabana_Slice_test_CUDA.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[  2%] Building CXX object core/unit_test/CMakeFiles/Cabana_Parallel_test_CUDA.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[  2%] Building CXX object core/unit_test/CMakeFiles/Cabana_CommunicationPlan_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[  2%] Building CXX object core/unit_test/CMakeFiles/Cabana_Distributor_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[  2%] Building CXX object core/unit_test/CMakeFiles/Cabana_Halo_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[  2%] Linking CXX executable Cabana_Slice_test_CUDA
[  3%] Linking CXX executable Cabana_ParameterPack_test_CUDA
 ---> Removed intermediate container 328177143f95
 ---> 76fe7cd8a01d
Step 13/18 : ENV DPCPP=/opt/intel/oneapi/compiler/${DPCPP_VERSION}/linux/bin-llvm/clang++
 ---> Running in 86343d5a073f
[  3%] Built target Cabana_Slice_test_CUDA
Scanning dependencies of target Cabana_Tuple_test_SERIAL
[  3%] Building CXX object core/unit_test/CMakeFiles/Cabana_Tuple_test_SERIAL.dir/SERIAL/tstTuple_SERIAL.cpp.o
[  3%] Built target Cabana_ParameterPack_test_CUDA
[  3%] Linking CXX executable Cabana_Parallel_test_CUDA
Scanning dependencies of target Cabana_Slice_test_SERIAL
[  3%] Building CXX object core/unit_test/CMakeFiles/Cabana_Slice_test_SERIAL.dir/SERIAL/tstSlice_SERIAL.cpp.o
[  3%] Built target Cabana_Parallel_test_CUDA
Scanning dependencies of target Cabana_DeepCopy_test_CUDA_UVM
[  3%] Building CXX object core/unit_test/CMakeFiles/Cabana_DeepCopy_test_CUDA_UVM.dir/CUDA_UVM/tstDeepCopy_CUDA_UVM.cpp.o
[  4%] Linking CXX executable Cabana_CommunicationPlan_MPI_test_CUDA
[  4%] Built target Cabana_CommunicationPlan_MPI_test_CUDA
[  4%] Building CXX object core/unit_test/CMakeFiles/Cabana_NeighborList_test_SERIAL.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
Scanning dependencies of target Cabana_LinkedCellList_test_CUDA_UVM
[  4%] Building CXX object core/unit_test/CMakeFiles/Cabana_LinkedCellList_test_CUDA_UVM.dir/CUDA_UVM/tstLinkedCellList_CUDA_UVM.cpp.o
[  5%] Linking CXX executable Cabana_Distributor_MPI_test_CUDA_UVM
[  5%] Linking CXX executable Cabana_NeighborList_test_SERIAL
[  5%] Linking CXX executable Cabana_Halo_MPI_test_CUDA_UVM
[  5%] Built target Cabana_NeighborList_test_SERIAL
[  5%] Built target Cabana_Distributor_MPI_test_CUDA_UVM
Scanning dependencies of target Cabana_SoA_test
[  5%] Built target Cabana_Halo_MPI_test_CUDA_UVM
Scanning dependencies of target Cabana_AoSoA_test_CUDA
[  5%] Building CXX object core/unit_test/CMakeFiles/Cabana_AoSoA_test_CUDA.dir/CUDA/tstAoSoA_CUDA.cpp.o
[  5%] Building CXX object core/unit_test/CMakeFiles/Cabana_SoA_test.dir/tstSoA.cpp.o
Scanning dependencies of target Cabana_CommunicationPlan_MPI_test_CUDA_UVM
[  5%] Building CXX object core/unit_test/CMakeFiles/Cabana_CommunicationPlan_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstCommunicationPlan_CUDA_UVM.cpp.o
[  6%] Building CXX object core/unit_test/CMakeFiles/Cabana_Tuple_test_SERIAL.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[  7%] Building CXX object core/unit_test/CMakeFiles/Cabana_Slice_test_SERIAL.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[  7%] Linking CXX executable Cabana_Slice_test_SERIAL
[  7%] Linking CXX executable Cabana_Tuple_test_SERIAL
[  7%] Built target Cabana_Slice_test_SERIAL
[  7%] Built target Cabana_Tuple_test_SERIAL
Scanning dependencies of target Cabana_CartesianGrid_test
Scanning dependencies of target Cabana_ParticleInit_test_CUDA_UVM
[  8%] Building CXX object core/unit_test/CMakeFiles/Cabana_CartesianGrid_test.dir/tstCartesianGrid.cpp.o
[  9%] Building CXX object core/unit_test/CMakeFiles/Cabana_ParticleInit_test_CUDA_UVM.dir/CUDA_UVM/tstParticleInit_CUDA_UVM.cpp.o
[  9%] Building CXX object core/unit_test/CMakeFiles/Cabana_SoA_test.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
 ---> Removed intermediate container 86343d5a073f
 ---> b7ac7b9bb9e4
Step 14/18 : RUN wget https://cloud.cees.ornl.gov/download/oneapi-for-nvidia-gpus-${DPCPP_VERSION}-linux.sh &&     chmod +x oneapi-for-nvidia-gpus-${DPCPP_VERSION}-linux.sh &&     ./oneapi-for-nvidia-gpus-${DPCPP_VERSION}-linux.sh -y &&     rm oneapi-for-nvidia-gpus-${DPCPP_VERSION}-linux.sh
 ---> Running in 9c588a7d954d
[  9%] Linking CXX executable Cabana_SoA_test
--2024-03-21 19:55:21--  https://cloud.cees.ornl.gov/download/oneapi-for-nvidia-gpus-2023.0.0-linux.sh
Resolving cloud.cees.ornl.gov (cloud.cees.ornl.gov)... 128.219.185.170
Connecting to cloud.cees.ornl.gov (cloud.cees.ornl.gov)|128.219.185.170|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 3048760 (2.9M) [application/octet-stream]
Saving to: 'oneapi-for-nvidia-gpus-2023.0.0-linux.sh'

     [wget progress meter trimmed: 0K-2950K, 1%-100%, ~13-57 MB/s]

2024-03-21 19:55:22 (38.1 MB/s) - 'oneapi-for-nvidia-gpus-2023.0.0-linux.sh' saved [3048760/3048760]


oneAPI for NVIDIA GPUs 2023.0.0 installer

Found oneAPI DPC++/C++ Compiler 2023.0.0 in /opt/intel/oneapi/.

By installing this software, you accept the oneAPI for NVIDIA GPUs License Agreement.

* CUDA plugin library installed in /opt/intel/oneapi/compiler/2023.0.0/linux/lib/.
* License installed in /opt/intel/oneapi/licensing/2023.0.0/.
* Documentation installed in /opt/intel/oneapi/compiler/2023.0.0/documentation/en/oneAPI_for_NVIDIA_GPUs/.

Installation complete.

[  9%] Building CXX object core/unit_test/CMakeFiles/Cabana_NeighborList_test_CUDA.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[  9%] Linking CXX executable Cabana_NeighborList_test_CUDA
[  9%] Built target Cabana_SoA_test
Scanning dependencies of target Cabana_ParticleInit_test_SERIAL
[  9%] Building CXX object core/unit_test/CMakeFiles/Cabana_ParticleInit_test_SERIAL.dir/SERIAL/tstParticleInit_SERIAL.cpp.o
[  9%] Built target Cabana_NeighborList_test_CUDA
[  9%] Building CXX object core/unit_test/CMakeFiles/Cabana_CartesianGrid_test.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[  9%] Building CXX object core/unit_test/CMakeFiles/Cabana_LinkedCellList_test_CUDA_UVM.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
Scanning dependencies of target Cabana_Index_test
[  9%] Linking CXX executable Cabana_CartesianGrid_test
[  9%] Building CXX object core/unit_test/CMakeFiles/Cabana_Index_test.dir/tstIndex.cpp.o
[  9%] Built target Cabana_CartesianGrid_test
Scanning dependencies of target Cabana_Version_test
[  9%] Building CXX object core/unit_test/CMakeFiles/Cabana_Version_test.dir/tstVersion.cpp.o
[  9%] Linking CXX executable Cabana_LinkedCellList_test_CUDA_UVM
[  9%] Building CXX object core/unit_test/CMakeFiles/Cabana_CommunicationPlan_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[  9%] Linking CXX executable Cabana_CommunicationPlan_MPI_test_CUDA_UVM
[  9%] Built target Cabana_LinkedCellList_test_CUDA_UVM
Scanning dependencies of target Cabana_NeighborListArborX_test_SERIAL
[  9%] Building CXX object core/unit_test/CMakeFiles/Cabana_NeighborListArborX_test_SERIAL.dir/SERIAL/tstNeighborListArborX_SERIAL.cpp.o
[  9%] Built target Cabana_CommunicationPlan_MPI_test_CUDA_UVM
Scanning dependencies of target Cabana_AoSoA_test_SERIAL
[  9%] Building CXX object core/unit_test/CMakeFiles/Cabana_AoSoA_test_SERIAL.dir/SERIAL/tstAoSoA_SERIAL.cpp.o
[  9%] Building CXX object core/unit_test/CMakeFiles/Cabana_AoSoA_test_CUDA.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[  9%] Linking CXX executable Cabana_AoSoA_test_CUDA
[  9%] Building CXX object core/unit_test/CMakeFiles/Cabana_Version_test.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[  9%] Linking CXX executable Cabana_Version_test
[  9%] Built target Cabana_AoSoA_test_CUDA
Scanning dependencies of target Cabana_DeepCopy_test_SERIAL
[  9%] Building CXX object core/unit_test/CMakeFiles/Cabana_DeepCopy_test_SERIAL.dir/SERIAL/tstDeepCopy_SERIAL.cpp.o
[  9%] Building CXX object core/unit_test/CMakeFiles/Cabana_Index_test.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[  9%] Built target Cabana_Version_test
[  9%] Building CXX object core/unit_test/CMakeFiles/Cabana_ParticleInit_test_CUDA_UVM.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 10%] Building CXX object core/unit_test/CMakeFiles/Cabana_DeepCopy_test_CUDA_UVM.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 10%] Linking CXX executable Cabana_Index_test
[ 10%] Linking CXX executable Cabana_ParticleInit_test_CUDA_UVM
Scanning dependencies of target Cabana_LinkedCellList_test_CUDA
[ 11%] Building CXX object core/unit_test/CMakeFiles/Cabana_LinkedCellList_test_CUDA.dir/CUDA/tstLinkedCellList_CUDA.cpp.o
[ 11%] Built target Cabana_Index_test
[ 11%] Built target Cabana_ParticleInit_test_CUDA_UVM
Scanning dependencies of target Cabana_Distributor_MPI_test_CUDA
[ 12%] Building CXX object core/unit_test/CMakeFiles/Cabana_Distributor_MPI_test_CUDA.dir/CUDA/tstDistributor_CUDA.cpp.o
Scanning dependencies of target Cabana_Halo_MPI_test_CUDA
[ 12%] Building CXX object core/unit_test/CMakeFiles/Cabana_Halo_MPI_test_CUDA.dir/CUDA/tstHalo_CUDA.cpp.o
[ 12%] Building CXX object core/unit_test/CMakeFiles/Cabana_ParticleInit_test_SERIAL.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 12%] Linking CXX executable Cabana_ParticleInit_test_SERIAL
[ 12%] Linking CXX executable Cabana_DeepCopy_test_CUDA_UVM
[ 12%] Built target Cabana_ParticleInit_test_SERIAL
Scanning dependencies of target Cabana_DeepCopy_test_CUDA
[ 12%] Building CXX object core/unit_test/CMakeFiles/Cabana_DeepCopy_test_CUDA.dir/CUDA/tstDeepCopy_CUDA.cpp.o
 ---> Removed intermediate container 9c588a7d954d
 ---> 073d3e520bee
Step 15/18 : ARG KOKKOS_VERSION=4.1.00
 ---> Running in b24937d0afa1
[ 12%] Built target Cabana_DeepCopy_test_CUDA_UVM
Scanning dependencies of target Cabana_LinkedCellList_test_SERIAL
[ 13%] Building CXX object core/unit_test/CMakeFiles/Cabana_LinkedCellList_test_SERIAL.dir/SERIAL/tstLinkedCellList_SERIAL.cpp.o
[ 13%] Building CXX object core/unit_test/CMakeFiles/Cabana_AoSoA_test_SERIAL.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 13%] Linking CXX executable Cabana_AoSoA_test_SERIAL
[ 13%] Built target Cabana_AoSoA_test_SERIAL
Scanning dependencies of target Cabana_ParameterPack_test_SERIAL
[ 13%] Building CXX object core/unit_test/CMakeFiles/Cabana_ParameterPack_test_SERIAL.dir/SERIAL/tstParameterPack_SERIAL.cpp.o
[ 13%] Building CXX object core/unit_test/CMakeFiles/Cabana_DeepCopy_test_SERIAL.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 13%] Linking CXX executable Cabana_DeepCopy_test_SERIAL
[ 13%] Building CXX object core/unit_test/CMakeFiles/Cabana_LinkedCellList_test_CUDA.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 13%] Linking CXX executable Cabana_LinkedCellList_test_CUDA
[ 13%] Built target Cabana_DeepCopy_test_SERIAL
Scanning dependencies of target Cabana_Parallel_test_SERIAL
[ 13%] Building CXX object core/unit_test/CMakeFiles/Cabana_Parallel_test_SERIAL.dir/SERIAL/tstParallel_SERIAL.cpp.o
[ 13%] Built target Cabana_LinkedCellList_test_CUDA
Scanning dependencies of target Cabana_ParticleInit_test_CUDA
[ 13%] Building CXX object core/unit_test/CMakeFiles/Cabana_ParticleInit_test_CUDA.dir/CUDA/tstParticleInit_CUDA.cpp.o
[ 13%] Building CXX object core/unit_test/CMakeFiles/Cabana_Distributor_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 13%] Linking CXX executable Cabana_Distributor_MPI_test_CUDA
[ 13%] Building CXX object core/unit_test/CMakeFiles/Cabana_LinkedCellList_test_SERIAL.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 13%] Linking CXX executable Cabana_LinkedCellList_test_SERIAL
[ 13%] Built target Cabana_Distributor_MPI_test_CUDA
[ 13%] Building CXX object core/unit_test/CMakeFiles/Cabana_Halo_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 14%] Linking CXX executable Cabana_Halo_MPI_test_CUDA
Scanning dependencies of target Cabana_ParticleList_test_CUDA
[ 15%] Building CXX object core/unit_test/CMakeFiles/Cabana_ParticleList_test_CUDA.dir/CUDA/tstParticleList_CUDA.cpp.o
[ 15%] Built target Cabana_LinkedCellList_test_SERIAL
[ 15%] Building CXX object core/unit_test/CMakeFiles/Cabana_ParticleList_test_CUDA.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
Scanning dependencies of target Cabana_Sort_test_CUDA
[ 16%] Building CXX object core/unit_test/CMakeFiles/Cabana_Sort_test_CUDA.dir/CUDA/tstSort_CUDA.cpp.o
[ 17%] Building CXX object core/unit_test/CMakeFiles/Cabana_ParameterPack_test_SERIAL.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 18%] Building CXX object core/unit_test/CMakeFiles/Cabana_NeighborListArborX_test_SERIAL.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 18%] Built target Cabana_Halo_MPI_test_CUDA
Scanning dependencies of target Cabana_Slice_test_CUDA_UVM
[ 18%] Building CXX object core/unit_test/CMakeFiles/Cabana_Slice_test_CUDA_UVM.dir/CUDA_UVM/tstSlice_CUDA_UVM.cpp.o
[ 18%] Linking CXX executable Cabana_ParameterPack_test_SERIAL
[ 18%] Linking CXX executable Cabana_NeighborListArborX_test_SERIAL
 ---> Removed intermediate container b24937d0afa1
 ---> eb006625fab6
Step 16/18 : ARG KOKKOS_OPTIONS="-DKokkos_ENABLE_SYCL=ON -DCMAKE_CXX_FLAGS=-Wno-unknown-cuda-version -DKokkos_ENABLE_UNSUPPORTED_ARCHS=ON -DKokkos_ARCH_VOLTA70=ON -DCMAKE_CXX_STANDARD=17"
 ---> Running in 8fee9f29bea6
[ 18%] Built target Cabana_NeighborListArborX_test_SERIAL
[ 18%] Built target Cabana_ParameterPack_test_SERIAL
[ 18%] Building CXX object core/unit_test/CMakeFiles/Cabana_Sort_test_CUDA.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
Scanning dependencies of target Cabana_NeighborListArborX_test_CUDA_UVM
[ 18%] Building CXX object core/unit_test/CMakeFiles/Cabana_NeighborListArborX_test_CUDA_UVM.dir/CUDA_UVM/tstNeighborListArborX_CUDA_UVM.cpp.o
Scanning dependencies of target Cabana_Tuple_test_CUDA
[ 18%] Building CXX object core/unit_test/CMakeFiles/Cabana_Tuple_test_CUDA.dir/CUDA/tstTuple_CUDA.cpp.o
[ 18%] Building CXX object core/unit_test/CMakeFiles/Cabana_Parallel_test_SERIAL.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 18%] Linking CXX executable Cabana_Parallel_test_SERIAL
[ 18%] Building CXX object core/unit_test/CMakeFiles/Cabana_DeepCopy_test_CUDA.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 18%] Linking CXX executable Cabana_DeepCopy_test_CUDA
[ 18%] Built target Cabana_Parallel_test_SERIAL
[ 18%] Linking CXX executable Cabana_ParticleList_test_CUDA
Scanning dependencies of target Cabana_ParticleList_test_SERIAL
[ 18%] Building CXX object core/unit_test/CMakeFiles/Cabana_ParticleList_test_SERIAL.dir/SERIAL/tstParticleList_SERIAL.cpp.o
[ 18%] Building CXX object core/unit_test/CMakeFiles/Cabana_ParticleInit_test_CUDA.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 18%] Built target Cabana_DeepCopy_test_CUDA
Scanning dependencies of target Cabana_ParticleList_test_CUDA_UVM
[ 18%] Linking CXX executable Cabana_ParticleInit_test_CUDA
[ 18%] Built target Cabana_ParticleList_test_CUDA
[ 18%] Building CXX object core/unit_test/CMakeFiles/Cabana_ParticleList_test_CUDA_UVM.dir/CUDA_UVM/tstParticleList_CUDA_UVM.cpp.o
Scanning dependencies of target Cabana_NeighborListArborX_test_CUDA
[ 18%] Building CXX object core/unit_test/CMakeFiles/Cabana_NeighborListArborX_test_CUDA.dir/CUDA/tstNeighborListArborX_CUDA.cpp.o
[ 18%] Building CXX object core/unit_test/CMakeFiles/Cabana_Tuple_test_CUDA.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 19%] Linking CXX executable Cabana_Tuple_test_CUDA
[ 19%] Built target Cabana_ParticleInit_test_CUDA
Scanning dependencies of target Cabana_Halo_MPI_test_SERIAL
[ 19%] Building CXX object core/unit_test/CMakeFiles/Cabana_Halo_MPI_test_SERIAL.dir/SERIAL/tstHalo_SERIAL.cpp.o
[ 19%] Building CXX object core/unit_test/CMakeFiles/Cabana_Slice_test_CUDA_UVM.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 19%] Linking CXX executable Cabana_Slice_test_CUDA_UVM
[ 19%] Built target Cabana_Tuple_test_CUDA
Scanning dependencies of target Cabana_Sort_test_SERIAL
[ 19%] Building CXX object core/unit_test/CMakeFiles/Cabana_Sort_test_SERIAL.dir/SERIAL/tstSort_SERIAL.cpp.o
[ 19%] Built target Cabana_Slice_test_CUDA_UVM
[ 19%] Building CXX object core/unit_test/CMakeFiles/Cabana_Sort_test_SERIAL.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 19%] Building CXX object core/unit_test/CMakeFiles/Cabana_NeighborListArborX_test_CUDA_UVM.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 19%] Building CXX object core/unit_test/CMakeFiles/Cabana_ParticleList_test_SERIAL.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 20%] Building CXX object core/unit_test/CMakeFiles/Cabana_Halo_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
Scanning dependencies of target Cabana_AoSoA_test_CUDA_UVM
[ 21%] Building CXX object core/unit_test/CMakeFiles/Cabana_AoSoA_test_CUDA_UVM.dir/CUDA_UVM/tstAoSoA_CUDA_UVM.cpp.o
[ 21%] Linking CXX executable Cabana_Sort_test_CUDA
[ 21%] Built target Cabana_Sort_test_CUDA
Scanning dependencies of target Cabana_NeighborList_test_CUDA_UVM
[ 21%] Building CXX object core/unit_test/CMakeFiles/Cabana_NeighborList_test_CUDA_UVM.dir/CUDA_UVM/tstNeighborList_CUDA_UVM.cpp.o
 ---> Removed intermediate container 8fee9f29bea6
 ---> f14902bf9c95
Step 17/18 : ENV KOKKOS_DIR=/opt/kokkos
 ---> Running in 5a9f571a808d
[ 21%] Linking CXX executable Cabana_ParticleList_test_SERIAL
[ 21%] Built target Cabana_ParticleList_test_SERIAL
Scanning dependencies of target Cabana_Distributor_MPI_test_SERIAL
[ 21%] Building CXX object core/unit_test/CMakeFiles/Cabana_Distributor_MPI_test_SERIAL.dir/SERIAL/tstDistributor_SERIAL.cpp.o
[ 21%] Building CXX object core/unit_test/CMakeFiles/Cabana_ParticleList_test_CUDA_UVM.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 22%] Linking CXX executable Cabana_ParticleList_test_CUDA_UVM
[ 22%] Linking CXX executable Cabana_Sort_test_SERIAL
[ 22%] Built target Cabana_ParticleList_test_CUDA_UVM
Scanning dependencies of target Cabana_Parallel_test_CUDA_UVM
[ 22%] Building CXX object core/unit_test/CMakeFiles/Cabana_Parallel_test_CUDA_UVM.dir/CUDA_UVM/tstParallel_CUDA_UVM.cpp.o
[ 22%] Linking CXX executable Cabana_Halo_MPI_test_SERIAL
[ 22%] Built target Cabana_Sort_test_SERIAL
Scanning dependencies of target Cabana_Sort_test_CUDA_UVM
[ 22%] Building CXX object core/unit_test/CMakeFiles/Cabana_Sort_test_CUDA_UVM.dir/CUDA_UVM/tstSort_CUDA_UVM.cpp.o
[ 22%] Built target Cabana_Halo_MPI_test_SERIAL
Scanning dependencies of target Cabana_Tuple_test_CUDA_UVM
[ 22%] Building CXX object core/unit_test/CMakeFiles/Cabana_Tuple_test_CUDA_UVM.dir/CUDA_UVM/tstTuple_CUDA_UVM.cpp.o
[ 22%] Building CXX object core/unit_test/CMakeFiles/Cabana_AoSoA_test_CUDA_UVM.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 22%] Linking CXX executable Cabana_AoSoA_test_CUDA_UVM
[ 22%] Built target Cabana_AoSoA_test_CUDA_UVM
Scanning dependencies of target Cabana_ParameterPack_test_CUDA_UVM
[ 22%] Building CXX object core/unit_test/CMakeFiles/Cabana_ParameterPack_test_CUDA_UVM.dir/CUDA_UVM/tstParameterPack_CUDA_UVM.cpp.o
 ---> Removed intermediate container 5a9f571a808d
 ---> 99f833633ed4
Step 18/18 : RUN . /opt/intel/oneapi/setvars.sh --include-intel-llvm &&     KOKKOS_URL=https://github.com/kokkos/kokkos/archive/${KOKKOS_VERSION}.tar.gz &&     KOKKOS_ARCHIVE=kokkos-${KOKKOS_HASH}.tar.gz &&     SCRATCH_DIR=/scratch && mkdir -p ${SCRATCH_DIR} && cd ${SCRATCH_DIR} &&     wget --quiet ${KOKKOS_URL} --output-document=${KOKKOS_ARCHIVE} &&     mkdir -p kokkos &&     tar -xf ${KOKKOS_ARCHIVE} -C kokkos --strip-components=1 &&     cd kokkos &&     mkdir -p build && cd build &&     cmake         -D CMAKE_BUILD_TYPE=Release         -D CMAKE_INSTALL_PREFIX=${KOKKOS_DIR}         -D CMAKE_CXX_COMPILER=${DPCPP}         ${KOKKOS_OPTIONS}         .. &&     make -j${NPROCS} install &&     rm -rf ${SCRATCH_DIR}
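Step 18 bundles the whole Kokkos-for-SYCL build into one RUN instruction. Unrolled into plain shell it is roughly the sketch below (DPCPP, KOKKOS_VERSION, KOKKOS_OPTIONS and KOKKOS_DIR take the values set in Steps 13 and 15-17; KOKKOS_HASH and NPROCS are assumed to be build arguments defined earlier in the Dockerfile and not shown in this log, so the archive name is simplified to kokkos.tar.gz here):

    # Put the oneAPI clang++ and its CUDA plugin on PATH
    . /opt/intel/oneapi/setvars.sh --include-intel-llvm

    # Values from Steps 13 and 15-17 of this Dockerfile
    DPCPP=/opt/intel/oneapi/compiler/2023.0.0/linux/bin-llvm/clang++
    KOKKOS_VERSION=4.1.00
    KOKKOS_DIR=/opt/kokkos
    KOKKOS_OPTIONS="-DKokkos_ENABLE_SYCL=ON -DCMAKE_CXX_FLAGS=-Wno-unknown-cuda-version -DKokkos_ENABLE_UNSUPPORTED_ARCHS=ON -DKokkos_ARCH_VOLTA70=ON -DCMAKE_CXX_STANDARD=17"

    # Fetch and unpack the Kokkos release tarball
    wget --quiet https://github.com/kokkos/kokkos/archive/${KOKKOS_VERSION}.tar.gz --output-document=kokkos.tar.gz
    mkdir -p kokkos && tar -xf kokkos.tar.gz -C kokkos --strip-components=1

    # Configure the SYCL backend for Volta GPUs and install into /opt/kokkos
    mkdir -p kokkos/build && cd kokkos/build
    cmake -D CMAKE_BUILD_TYPE=Release \
          -D CMAKE_INSTALL_PREFIX=${KOKKOS_DIR} \
          -D CMAKE_CXX_COMPILER=${DPCPP} \
          ${KOKKOS_OPTIONS} \
          ..
    make -j${NPROCS} install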
 ---> Running in ede8752558ff
 
:: initializing oneAPI environment ...
   dash: SH_VERSION = unknown
   args: Using "$@" for setvars.sh arguments: 
:: compiler -- latest
:: debugger -- latest
:: dev-utilities -- latest
:: dpl -- latest
:: tbb -- latest
:: oneAPI environment initialized ::
 
[ 22%] Building CXX object core/unit_test/CMakeFiles/Cabana_Distributor_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 22%] Linking CXX executable Cabana_Distributor_MPI_test_SERIAL
-- Setting default Kokkos CXX standard to 17
[ 22%] Building CXX object core/unit_test/CMakeFiles/Cabana_Parallel_test_CUDA_UVM.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 22%] Built target Cabana_Distributor_MPI_test_SERIAL
[ 22%] Building CXX object core/unit_test/CMakeFiles/Cabana_ParameterPack_test_CUDA_UVM.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 23%] Linking CXX executable Cabana_Parallel_test_CUDA_UVM
[ 23%] Building CXX object core/unit_test/CMakeFiles/Cabana_NeighborListArborX_test_CUDA.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
-- The CXX compiler identification is IntelLLVM 2023.0.0
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /opt/intel/oneapi/compiler/2023.0.0/linux/bin-llvm/clang++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Kokkos version: 4.1.00
-- The project name is: Kokkos
-- Using internal gtest for testing
-- SERIAL backend is being turned on to ensure there is at least one Host space. To change this, you must enable another host execution space and configure with -DKokkos_ENABLE_SERIAL=OFF or change CMakeCache.txt
-- Using -std=gnu++17 for C++17 extensions as feature
[ 24%] Building CXX object core/unit_test/CMakeFiles/Cabana_NeighborList_test_CUDA_UVM.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
-- Performing Test KOKKOS_IMPL_SYCL_DEVICE_GLOBAL_SUPPORTED
Scanning dependencies of target Cabana_CommunicationPlan_MPI_test_SERIAL
[ 24%] Building CXX object core/unit_test/CMakeFiles/Cabana_CommunicationPlan_MPI_test_SERIAL.dir/SERIAL/tstCommunicationPlan_SERIAL.cpp.o
[ 24%] Built target Cabana_Parallel_test_CUDA_UVM
Scanning dependencies of target Grid_FastFourierTransform_MPI_test_CUDA_UVM
[ 24%] Building CXX object core/unit_test/CMakeFiles/Cabana_Tuple_test_CUDA_UVM.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 24%] Building CXX object grid/unit_test/CMakeFiles/Grid_FastFourierTransform_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstFastFourierTransform_CUDA_UVM.cpp.o
[ 24%] Linking CXX executable Cabana_Tuple_test_CUDA_UVM
[ 24%] Built target Cabana_Tuple_test_CUDA_UVM
[ 24%] Linking CXX executable Cabana_ParameterPack_test_CUDA_UVM
Scanning dependencies of target Grid_HypreSemiStructuredSolverMulti_MPI_test_CUDA_UVM
[ 25%] Building CXX object grid/unit_test/CMakeFiles/Grid_HypreSemiStructuredSolverMulti_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstHypreSemiStructuredSolverMulti_CUDA_UVM.cpp.o
-- Performing Test KOKKOS_IMPL_SYCL_DEVICE_GLOBAL_SUPPORTED - Success
-- Built-in Execution Spaces:
--     Device Parallel: Kokkos::Experimental::SYCL
--     Host Parallel: NoTypeDefined
--       Host Serial: SERIAL
-- 
-- Architectures:
--  VOLTA70
-- Found TPLLIBDL: /usr/include  
-- Looking for C++ include oneapi/dpl/execution
[ 25%] Built target Cabana_ParameterPack_test_CUDA_UVM
Scanning dependencies of target Grid_HypreStructuredSolver3d_MPI_test_CUDA_UVM
[ 26%] Building CXX object grid/unit_test/CMakeFiles/Grid_HypreStructuredSolver3d_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstHypreStructuredSolver3d_CUDA_UVM.cpp.o
[ 26%] Linking CXX executable Cabana_NeighborListArborX_test_CUDA_UVM
[ 26%] Building CXX object core/unit_test/CMakeFiles/Cabana_Sort_test_CUDA_UVM.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 26%] Built target Cabana_NeighborListArborX_test_CUDA_UVM
[ 27%] Linking CXX executable Cabana_Sort_test_CUDA_UVM
Scanning dependencies of target Grid_SparseLocalGrid_MPI_test_CUDA_UVM
[ 27%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseLocalGrid_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstSparseLocalGrid_CUDA_UVM.cpp.o
[ 27%] Built target Cabana_Sort_test_CUDA_UVM
Scanning dependencies of target Grid_GlobalGrid_MPI_test_CUDA
[ 28%] Building CXX object grid/unit_test/CMakeFiles/Grid_GlobalGrid_MPI_test_CUDA.dir/CUDA/tstGlobalGrid_CUDA.cpp.o
-- Looking for C++ include oneapi/dpl/execution - found
-- Looking for C++ include oneapi/dpl/algorithm
[ 28%] Building CXX object grid/unit_test/CMakeFiles/Grid_HypreSemiStructuredSolverMulti_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 29%] Building CXX object core/unit_test/CMakeFiles/Cabana_CommunicationPlan_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 29%] Linking CXX executable Cabana_CommunicationPlan_MPI_test_SERIAL
[ 29%] Building CXX object grid/unit_test/CMakeFiles/Grid_HypreStructuredSolver3d_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 29%] Built target Cabana_CommunicationPlan_MPI_test_SERIAL
[ 29%] Building CXX object grid/unit_test/CMakeFiles/Grid_GlobalGrid_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
-- Looking for C++ include oneapi/dpl/algorithm - found
-- Performing Test KOKKOS_NO_TBB_CONFLICT
-- Performing Test KOKKOS_NO_TBB_CONFLICT - Failed
-- Using internal desul_atomics copy
-- Kokkos Devices: SERIAL;SYCL, Kokkos Backends: SERIAL;SYCL
-- Configuring done (27.8s)
-- Generating done (0.6s)
-- Build files have been written to: /scratch/kokkos/build
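
Note: the configure summary above reports a Kokkos build whose device-parallel backend is Kokkos::Experimental::SYCL, whose only host backend is Serial, and whose target architecture is VOLTA70. A minimal sketch, assuming only a standard Kokkos 4.x install (it is not part of this build), of how an application compiled against that install would see those execution spaces:

    // hello_spaces.cpp -- illustrative only, not taken from this job.
    #include <Kokkos_Core.hpp>
    #include <cstdio>

    int main( int argc, char* argv[] )
    {
        Kokkos::initialize( argc, argv );
        {
            // With the configuration shown above, the default execution
            // space is Kokkos::Experimental::SYCL and the default host
            // execution space is Kokkos::Serial.
            std::printf( "device space: %s\n",
                         Kokkos::DefaultExecutionSpace::name() );
            std::printf( "host space:   %s\n",
                         Kokkos::DefaultHostExecutionSpace::name() );

            // A trivial kernel dispatched to the default (SYCL) device.
            Kokkos::parallel_for(
                "hello", Kokkos::RangePolicy<>( 0, 16 ),
                KOKKOS_LAMBDA( const int i ) { (void)i; } );
            Kokkos::fence();
        }
        Kokkos::finalize();
        return 0;
    }
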
[  3%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_CPUDiscovery.cpp.o
[  6%] Building CXX object simd/src/CMakeFiles/kokkossimd.dir/Kokkos_SIMD_dummy.cpp.o
[ 10%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_Command_Line_Parsing.cpp.o
[ 10%] Built target AlwaysCheckGit
[ 13%] Linking CXX static library libkokkossimd.a
[ 17%] Building CXX object CMakeFiles/impl_git_version.dir/generated/Kokkos_Version_Info.cpp.o
[ 17%] Built target kokkossimd
[ 20%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_Core.cpp.o
[ 24%] Linking CXX static library libimpl_git_version.a
[ 24%] Built target impl_git_version
[ 27%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_Error.cpp.o
[ 31%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_ExecPolicy.cpp.o
[ 30%] Linking CXX executable Cabana_NeighborListArborX_test_CUDA
[ 30%] Linking CXX executable Cabana_NeighborList_test_CUDA_UVM
[ 31%] Building CXX object grid/unit_test/CMakeFiles/Grid_FastFourierTransform_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 31%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseLocalGrid_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 31%] Built target Cabana_NeighborList_test_CUDA_UVM
Scanning dependencies of target Grid_SparseHalo_MPI_test_SERIAL
[ 31%] Built target Cabana_NeighborListArborX_test_CUDA
[ 31%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseHalo_MPI_test_SERIAL.dir/SERIAL/tstSparseHalo_SERIAL.cpp.o
Scanning dependencies of target Grid_SparseDimPartitioner_MPI_test_SERIAL
[ 31%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseDimPartitioner_MPI_test_SERIAL.dir/SERIAL/tstSparseDimPartitioner_SERIAL.cpp.o
[ 32%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseDimPartitioner_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 32%] Linking CXX executable Grid_HypreSemiStructuredSolverMulti_MPI_test_CUDA_UVM
[ 34%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_HostBarrier.cpp.o
[ 32%] Linking CXX executable Grid_HypreStructuredSolver3d_MPI_test_CUDA_UVM
[ 32%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseHalo_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
Scanning dependencies of target Grid_IndexConversion_MPI_test_CUDA
[ 32%] Building CXX object grid/unit_test/CMakeFiles/Grid_IndexConversion_MPI_test_CUDA.dir/CUDA/tstIndexConversion_CUDA.cpp.o
[ 32%] Built target Grid_HypreSemiStructuredSolverMulti_MPI_test_CUDA_UVM
[ 32%] Building CXX object grid/unit_test/CMakeFiles/Grid_IndexConversion_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 32%] Linking CXX executable Grid_GlobalGrid_MPI_test_CUDA
[ 32%] Built target Grid_HypreStructuredSolver3d_MPI_test_CUDA_UVM
Scanning dependencies of target Grid_FastFourierTransform_MPI_test_SERIAL
[ 32%] Building CXX object grid/unit_test/CMakeFiles/Grid_FastFourierTransform_MPI_test_SERIAL.dir/SERIAL/tstFastFourierTransform_SERIAL.cpp.o
Scanning dependencies of target Grid_Interpolation3d_MPI_test_SERIAL
[ 32%] Building CXX object grid/unit_test/CMakeFiles/Grid_Interpolation3d_MPI_test_SERIAL.dir/SERIAL/tstInterpolation3d_SERIAL.cpp.o
[ 32%] Linking CXX executable Grid_FastFourierTransform_MPI_test_CUDA_UVM
[ 32%] Built target Grid_GlobalGrid_MPI_test_CUDA
Scanning dependencies of target Grid_HypreSemiStructuredSolver_MPI_test_SERIAL
[ 32%] Building CXX object grid/unit_test/CMakeFiles/Grid_HypreSemiStructuredSolver_MPI_test_SERIAL.dir/SERIAL/tstHypreSemiStructuredSolver_SERIAL.cpp.o
[ 32%] Built target Grid_FastFourierTransform_MPI_test_CUDA_UVM
[ 32%] Building CXX object grid/unit_test/CMakeFiles/Grid_FastFourierTransform_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 32%] Linking CXX executable Grid_SparseLocalGrid_MPI_test_CUDA_UVM
[ 32%] Building CXX object grid/unit_test/CMakeFiles/Grid_HypreSemiStructuredSolver_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 37%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_HostSpace.cpp.o
Scanning dependencies of target Grid_SplineEvaluation2d_MPI_test_SERIAL
[ 32%] Building CXX object grid/unit_test/CMakeFiles/Grid_SplineEvaluation2d_MPI_test_SERIAL.dir/SERIAL/tstSplineEvaluation2d_SERIAL.cpp.o
[ 41%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_HostSpace_deepcopy.cpp.o
[ 32%] Built target Grid_SparseLocalGrid_MPI_test_CUDA_UVM
Scanning dependencies of target Grid_Splines_MPI_test_CUDA_UVM
[ 32%] Building CXX object grid/unit_test/CMakeFiles/Grid_Splines_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstSplines_CUDA_UVM.cpp.o
[ 44%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_HostThreadTeam.cpp.o
[ 32%] Linking CXX executable Grid_SparseDimPartitioner_MPI_test_SERIAL
[ 48%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_MemoryPool.cpp.o
/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseHalo.hpp(601): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseHalo.hpp(601): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<double [3], float > ,  ::Kokkos::HostSpace,  ::Cabana::Grid::Node,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Serial, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<double [3], float > ,  ::Kokkos::HostSpace,  ::Cabana::Grid::Node,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Serial, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray [subobject]") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseHalo.hpp(922): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseHalo.hpp(922): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseHalo.hpp(1184): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseHalo.hpp(1184): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseHalo.hpp(1184): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseHalo.hpp(1184): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("") is not allowed
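
Note: the nvcc warnings above are the usual diagnostic for a type that holds a std::shared_ptr (whose control-block operations live in the host-only std::__shared_count) being copied or destroyed from a __host__ __device__ function, for example when such an object is captured by value in a KOKKOS_LAMBDA or has a KOKKOS_FUNCTION destructor. They remain warnings rather than errors because the host-only path is never actually executed on the device. A minimal sketch, not the Cabana code itself, that illustrates this class of warning:

    // shared_ptr_capture.cpp -- illustrative only, not taken from this job.
    #include <Kokkos_Core.hpp>
    #include <memory>

    struct HoldsSharedPtr
    {
        // Copying or destroying this member goes through
        // std::__shared_count, which is a host-only function.
        std::shared_ptr<int> data = std::make_shared<int>( 0 );
    };

    void launch()
    {
        HoldsSharedPtr holder;
        // Capturing 'holder' by value pulls the shared_ptr copy and
        // destruction machinery into code nvcc treats as
        // __host__ __device__, so it typically emits "calling a __host__
        // function from a __host__ __device__ function is not allowed"
        // warnings like the ones seen above.
        Kokkos::parallel_for(
            "capture_shared_ptr", Kokkos::RangePolicy<>( 0, 16 ),
            KOKKOS_LAMBDA( const int /*i*/ ) { (void)holder; } );
        Kokkos::fence();
    }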

[ 33%] Linking CXX executable Grid_SparseHalo_MPI_test_SERIAL
[ 33%] Built target Grid_SparseDimPartitioner_MPI_test_SERIAL
[ 33%] Building CXX object grid/unit_test/CMakeFiles/Grid_SplineEvaluation2d_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 33%] Building CXX object grid/unit_test/CMakeFiles/Grid_Interpolation3d_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
Scanning dependencies of target Grid_IndexConversion_MPI_test_SERIAL
[ 33%] Building CXX object grid/unit_test/CMakeFiles/Grid_IndexConversion_MPI_test_SERIAL.dir/SERIAL/tstIndexConversion_SERIAL.cpp.o
[ 33%] Linking CXX executable Grid_HypreSemiStructuredSolver_MPI_test_SERIAL
[ 33%] Built target Grid_HypreSemiStructuredSolver_MPI_test_SERIAL
[ 33%] Built target Grid_SparseHalo_MPI_test_SERIAL
Scanning dependencies of target Grid_LocalMesh3d_MPI_test_SERIAL
[ 33%] Building CXX object grid/unit_test/CMakeFiles/Grid_LocalMesh3d_MPI_test_SERIAL.dir/SERIAL/tstLocalMesh3d_SERIAL.cpp.o
Scanning dependencies of target Grid_GlobalMesh_MPI_test_CUDA_UVM
[ 33%] Building CXX object grid/unit_test/CMakeFiles/Grid_GlobalMesh_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstGlobalMesh_CUDA_UVM.cpp.o
[ 51%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_MemorySpace.cpp.o
[ 33%] Building CXX object grid/unit_test/CMakeFiles/Grid_Splines_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 55%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_Profiling.cpp.o
[ 58%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_SharedAlloc.cpp.o
[ 34%] Linking CXX executable Grid_SplineEvaluation2d_MPI_test_SERIAL
[ 34%] Linking CXX executable Grid_IndexConversion_MPI_test_CUDA
[ 34%] Built target Grid_SplineEvaluation2d_MPI_test_SERIAL
Scanning dependencies of target Grid_ParticleGridDistributor2d_MPI_test_CUDA
[ 34%] Building CXX object grid/unit_test/CMakeFiles/Grid_ParticleGridDistributor2d_MPI_test_CUDA.dir/CUDA/tstParticleGridDistributor2d_CUDA.cpp.o
[ 34%] Linking CXX executable Grid_Interpolation3d_MPI_test_SERIAL
[ 34%] Built target Grid_IndexConversion_MPI_test_CUDA
Scanning dependencies of target Grid_Parallel_MPI_test_CUDA
[ 35%] Building CXX object grid/unit_test/CMakeFiles/Grid_Parallel_MPI_test_CUDA.dir/CUDA/tstParallel_CUDA.cpp.o
[ 35%] Linking CXX executable Grid_FastFourierTransform_MPI_test_SERIAL
[ 35%] Building CXX object grid/unit_test/CMakeFiles/Grid_GlobalMesh_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 62%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_Spinwait.cpp.o
[ 65%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_Stacktrace.cpp.o
[ 35%] Built target Grid_Interpolation3d_MPI_test_SERIAL
[ 35%] Built target Grid_FastFourierTransform_MPI_test_SERIAL
[ 35%] Building CXX object grid/unit_test/CMakeFiles/Grid_ParticleGridDistributor2d_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
Scanning dependencies of target Grid_ParticleGridDistributor3d_MPI_test_SERIAL
Scanning dependencies of target Grid_SparseLocalGrid_MPI_test_CUDA
[ 35%] Building CXX object grid/unit_test/CMakeFiles/Grid_ParticleGridDistributor3d_MPI_test_SERIAL.dir/SERIAL/tstParticleGridDistributor3d_SERIAL.cpp.o
[ 35%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseLocalGrid_MPI_test_CUDA.dir/CUDA/tstSparseLocalGrid_CUDA.cpp.o
[ 35%] Linking CXX executable Grid_Splines_MPI_test_CUDA_UVM
[ 68%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_hwloc.cpp.o
[ 72%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/Serial/Kokkos_Serial.cpp.o
[ 35%] Built target Grid_Splines_MPI_test_CUDA_UVM
[ 35%] Building CXX object grid/unit_test/CMakeFiles/Grid_LocalMesh3d_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 35%] Building CXX object grid/unit_test/CMakeFiles/Grid_IndexConversion_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
Scanning dependencies of target Grid_Array3d_MPI_test_CUDA
[ 35%] Building CXX object grid/unit_test/CMakeFiles/Grid_Array3d_MPI_test_CUDA.dir/CUDA/tstArray3d_CUDA.cpp.o
[ 35%] Linking CXX executable Grid_LocalMesh3d_MPI_test_SERIAL
[ 35%] Linking CXX executable Grid_IndexConversion_MPI_test_SERIAL
[ 75%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/Serial/Kokkos_Serial_Task.cpp.o
[ 35%] Built target Grid_LocalMesh3d_MPI_test_SERIAL
[ 35%] Built target Grid_IndexConversion_MPI_test_SERIAL
[ 35%] Building CXX object grid/unit_test/CMakeFiles/Grid_ParticleGridDistributor3d_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 35%] Linking CXX executable Grid_GlobalMesh_MPI_test_CUDA_UVM
Scanning dependencies of target Grid_LocalMesh2d_MPI_test_CUDA_UVM
[ 35%] Building CXX object grid/unit_test/CMakeFiles/Grid_LocalMesh2d_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstLocalMesh2d_CUDA_UVM.cpp.o
[ 35%] Building CXX object grid/unit_test/CMakeFiles/Grid_Parallel_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 35%] Building CXX object grid/unit_test/CMakeFiles/Grid_LocalMesh2d_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
Scanning dependencies of target Grid_SplineEvaluation3d_MPI_test_SERIAL
[ 35%] Building CXX object grid/unit_test/CMakeFiles/Grid_SplineEvaluation3d_MPI_test_SERIAL.dir/SERIAL/tstSplineEvaluation3d_SERIAL.cpp.o
[ 79%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/SYCL/Kokkos_SYCL.cpp.o
[ 35%] Built target Grid_GlobalMesh_MPI_test_CUDA_UVM
Scanning dependencies of target Grid_GlobalMesh_MPI_test_CUDA
[ 35%] Building CXX object grid/unit_test/CMakeFiles/Grid_GlobalMesh_MPI_test_CUDA.dir/CUDA/tstGlobalMesh_CUDA.cpp.o
[ 36%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseLocalGrid_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 36%] Linking CXX executable Grid_SparseLocalGrid_MPI_test_CUDA
[ 82%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/SYCL/Kokkos_SYCL_Instance.cpp.o
[ 36%] Linking CXX executable Grid_ParticleGridDistributor3d_MPI_test_SERIAL
[ 37%] Linking CXX executable Grid_ParticleGridDistributor2d_MPI_test_CUDA
[ 37%] Linking CXX executable Grid_Parallel_MPI_test_CUDA
[ 37%] Built target Grid_SparseLocalGrid_MPI_test_CUDA
Scanning dependencies of target Grid_ParticleGridDistributor2d_MPI_test_SERIAL
[ 37%] Building CXX object grid/unit_test/CMakeFiles/Grid_ParticleGridDistributor2d_MPI_test_SERIAL.dir/SERIAL/tstParticleGridDistributor2d_SERIAL.cpp.o
[ 37%] Built target Grid_ParticleGridDistributor3d_MPI_test_SERIAL
Scanning dependencies of target Grid_LocalMesh2d_MPI_test_CUDA
[ 37%] Building CXX object grid/unit_test/CMakeFiles/Grid_LocalMesh2d_MPI_test_CUDA.dir/CUDA/tstLocalMesh2d_CUDA.cpp.o
[ 37%] Built target Grid_Parallel_MPI_test_CUDA
[ 37%] Built target Grid_ParticleGridDistributor2d_MPI_test_CUDA
[ 38%] Building CXX object grid/unit_test/CMakeFiles/Grid_ParticleGridDistributor2d_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
Scanning dependencies of target Grid_SparseIndexSpace_MPI_test_CUDA
[ 38%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseIndexSpace_MPI_test_CUDA.dir/CUDA/tstSparseIndexSpace_CUDA.cpp.o
Scanning dependencies of target Grid_Interpolation2d_MPI_test_SERIAL
[ 38%] Building CXX object grid/unit_test/CMakeFiles/Grid_Interpolation2d_MPI_test_SERIAL.dir/SERIAL/tstInterpolation2d_SERIAL.cpp.o
[ 86%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/SYCL/Kokkos_SYCL_Space.cpp.o
[ 38%] Building CXX object grid/unit_test/CMakeFiles/Grid_GlobalMesh_MPI_test_CUDA.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 89%] Building CXX object core/src/CMakeFiles/kokkoscore.dir/__/__/tpls/desul/src/Lock_Array_SYCL.cpp.o
[ 39%] Linking CXX executable Grid_GlobalMesh_MPI_test_CUDA
[ 39%] Building CXX object grid/unit_test/CMakeFiles/Grid_SplineEvaluation3d_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 39%] Linking CXX executable Grid_LocalMesh2d_MPI_test_CUDA_UVM
[ 39%] Linking CXX executable Grid_SplineEvaluation3d_MPI_test_SERIAL
[ 39%] Built target Grid_GlobalMesh_MPI_test_CUDA
Scanning dependencies of target Grid_HypreStructuredSolver2d_MPI_test_SERIAL
[ 39%] Building CXX object grid/unit_test/CMakeFiles/Grid_HypreStructuredSolver2d_MPI_test_SERIAL.dir/SERIAL/tstHypreStructuredSolver2d_SERIAL.cpp.o
[ 39%] Built target Grid_LocalMesh2d_MPI_test_CUDA_UVM
[ 39%] Built target Grid_SplineEvaluation3d_MPI_test_SERIAL
Scanning dependencies of target Grid_Parallel_MPI_test_CUDA_UVM
Scanning dependencies of target Grid_SparseIndexSpace_MPI_test_SERIAL
[ 39%] Building CXX object grid/unit_test/CMakeFiles/Grid_Parallel_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstParallel_CUDA_UVM.cpp.o
[ 39%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseIndexSpace_MPI_test_SERIAL.dir/SERIAL/tstSparseIndexSpace_SERIAL.cpp.o
[ 40%] Building CXX object grid/unit_test/CMakeFiles/Grid_Array3d_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 40%] Linking CXX executable Grid_Array3d_MPI_test_CUDA
[ 40%] Built target Grid_Array3d_MPI_test_CUDA
Scanning dependencies of target Grid_GlobalMesh_MPI_test_SERIAL
[ 40%] Building CXX object grid/unit_test/CMakeFiles/Grid_GlobalMesh_MPI_test_SERIAL.dir/SERIAL/tstGlobalMesh_SERIAL.cpp.o
[ 40%] Linking CXX executable Grid_ParticleGridDistributor2d_MPI_test_SERIAL
[ 40%] Building CXX object grid/unit_test/CMakeFiles/Grid_LocalMesh2d_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 41%] Linking CXX executable Grid_LocalMesh2d_MPI_test_CUDA
[ 41%] Built target Grid_ParticleGridDistributor2d_MPI_test_SERIAL
[ 93%] Linking CXX static library libkokkoscore.a
[ 93%] Built target kokkoscore
[ 96%] Building CXX object containers/src/CMakeFiles/kokkoscontainers.dir/impl/Kokkos_UnorderedMap_impl.cpp.o
Scanning dependencies of target Grid_Halo3d_MPI_test_SERIAL
[ 41%] Building CXX object grid/unit_test/CMakeFiles/Grid_Halo3d_MPI_test_SERIAL.dir/SERIAL/tstHalo3d_SERIAL.cpp.o
[ 41%] Built target Grid_LocalMesh2d_MPI_test_CUDA
[ 42%] Building CXX object grid/unit_test/CMakeFiles/Grid_GlobalMesh_MPI_test_SERIAL.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 42%] Building CXX object grid/unit_test/CMakeFiles/Grid_Interpolation2d_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 43%] Linking CXX executable Grid_Interpolation2d_MPI_test_SERIAL
[ 44%] Building CXX object grid/unit_test/CMakeFiles/Grid_HypreStructuredSolver2d_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 44%] Building CXX object grid/unit_test/CMakeFiles/Grid_Parallel_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 44%] Built target Grid_Interpolation2d_MPI_test_SERIAL
Scanning dependencies of target Grid_Array3d_MPI_test_SERIAL
[ 44%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseIndexSpace_MPI_test_CUDA.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 45%] Building CXX object grid/unit_test/CMakeFiles/Grid_Array3d_MPI_test_SERIAL.dir/SERIAL/tstArray3d_SERIAL.cpp.o
[ 45%] Linking CXX executable Grid_HypreStructuredSolver2d_MPI_test_SERIAL
Scanning dependencies of target Grid_ParticleInit_MPI_test_CUDA
[ 46%] Building CXX object grid/unit_test/CMakeFiles/Grid_ParticleInit_MPI_test_CUDA.dir/CUDA/tstParticleInit_CUDA.cpp.o
[ 46%] Linking CXX executable Grid_GlobalMesh_MPI_test_SERIAL
[ 46%] Built target Grid_HypreStructuredSolver2d_MPI_test_SERIAL
Scanning dependencies of target Grid_ParticleList_MPI_test_SERIAL
[ 46%] Building CXX object grid/unit_test/CMakeFiles/Grid_ParticleList_MPI_test_SERIAL.dir/SERIAL/tstParticleList_SERIAL.cpp.o
[ 46%] Linking CXX executable Grid_SparseIndexSpace_MPI_test_CUDA
[ 46%] Built target Grid_GlobalMesh_MPI_test_SERIAL
Scanning dependencies of target Grid_Splines_MPI_test_SERIAL
[ 47%] Building CXX object grid/unit_test/CMakeFiles/Grid_Splines_MPI_test_SERIAL.dir/SERIAL/tstSplines_SERIAL.cpp.o
[100%] Linking CXX static library libkokkoscontainers.a
[100%] Built target kokkoscontainers
Install the project...
-- Install configuration: "Release"
-- Installing: /opt/kokkos/include
-- Installing: /opt/kokkos/include/Kokkos_Crs.hpp
-- Installing: /opt/kokkos/include/Kokkos_HostSpace.hpp
-- Installing: /opt/kokkos/include/Kokkos_AnonymousSpace.hpp
-- Installing: /opt/kokkos/include/Kokkos_Complex.hpp
-- Installing: /opt/kokkos/include/Kokkos_Extents.hpp
-- Installing: /opt/kokkos/include/Kokkos_Rank.hpp
-- Installing: /opt/kokkos/include/KokkosExp_MDRangePolicy.hpp
-- Installing: /opt/kokkos/include/Kokkos_Half.hpp
-- Installing: /opt/kokkos/include/Kokkos_UniqueToken.hpp
-- Installing: /opt/kokkos/include/Kokkos_Atomics_Desul_Config.hpp
-- Installing: /opt/kokkos/include/Kokkos_LogicalSpaces.hpp
-- Installing: /opt/kokkos/include/Kokkos_Atomic.hpp
-- Installing: /opt/kokkos/include/Kokkos_GraphNode.hpp
-- Installing: /opt/kokkos/include/Kokkos_BitManipulation.hpp
-- Installing: /opt/kokkos/include/Kokkos_View.hpp
-- Installing: /opt/kokkos/include/Kokkos_TaskScheduler.hpp
-- Installing: /opt/kokkos/include/Kokkos_ScratchSpace.hpp
-- Installing: /opt/kokkos/include/Serial
-- Installing: /opt/kokkos/include/Serial/Kokkos_Serial_Task.hpp
-- Installing: /opt/kokkos/include/Serial/Kokkos_Serial_Parallel_Range.hpp
-- Installing: /opt/kokkos/include/Serial/Kokkos_Serial.hpp
-- Installing: /opt/kokkos/include/Serial/Kokkos_Serial_UniqueToken.hpp
-- Installing: /opt/kokkos/include/Serial/Kokkos_Serial_ZeroMemset.hpp
-- Installing: /opt/kokkos/include/Serial/Kokkos_Serial_Parallel_MDRange.hpp
-- Installing: /opt/kokkos/include/Serial/Kokkos_Serial_WorkGraphPolicy.hpp
-- Installing: /opt/kokkos/include/Serial/Kokkos_Serial_MDRangePolicy.hpp
-- Installing: /opt/kokkos/include/Serial/Kokkos_Serial_Parallel_Team.hpp
-- Installing: /opt/kokkos/include/Kokkos_ReductionIdentity.hpp
-- Installing: /opt/kokkos/include/Kokkos_Atomics_Desul_Volatile_Wrapper.hpp
-- Installing: /opt/kokkos/include/Kokkos_MathematicalConstants.hpp
-- Installing: /opt/kokkos/include/Kokkos_Timer.hpp
-- Installing: /opt/kokkos/include/Kokkos_Concepts.hpp
-- Installing: /opt/kokkos/include/Kokkos_MemoryTraits.hpp
-- Installing: /opt/kokkos/include/Kokkos_Profiling_ScopedRegion.hpp
-- Installing: /opt/kokkos/include/Threads
-- Installing: /opt/kokkos/include/Threads/Kokkos_Threads_Parallel_MDRange.hpp
-- Installing: /opt/kokkos/include/Threads/Kokkos_Threads.hpp
-- Installing: /opt/kokkos/include/Threads/Kokkos_Threads_Parallel_Range.hpp
-- Installing: /opt/kokkos/include/Threads/Kokkos_ThreadsExec.hpp
-- Installing: /opt/kokkos/include/Threads/Kokkos_ThreadsTeam.hpp
-- Installing: /opt/kokkos/include/Threads/Kokkos_Threads_WorkGraphPolicy.hpp
-- Installing: /opt/kokkos/include/Threads/Kokkos_Threads_UniqueToken.hpp
-- Installing: /opt/kokkos/include/Threads/Kokkos_Threads_MDRangePolicy.hpp
-- Installing: /opt/kokkos/include/Threads/Kokkos_Threads_Parallel_Team.hpp
-- Installing: /opt/kokkos/include/Kokkos_Parallel.hpp
-- Installing: /opt/kokkos/include/Kokkos_Atomics_Desul_Wrapper.hpp
-- Installing: /opt/kokkos/include/Kokkos_DetectionIdiom.hpp
-- Installing: /opt/kokkos/include/Kokkos_PointerOwnership.hpp
-- Installing: /opt/kokkos/include/Kokkos_Profiling_ProfileSection.hpp
-- Installing: /opt/kokkos/include/HPX
-- Installing: /opt/kokkos/include/HPX/Kokkos_HPX_Task.hpp
-- Installing: /opt/kokkos/include/HPX/Kokkos_HPX_MDRangePolicy.hpp
-- Installing: /opt/kokkos/include/HPX/Kokkos_HPX.hpp
-- Installing: /opt/kokkos/include/HPX/Kokkos_HPX_WorkGraphPolicy.hpp
-- Installing: /opt/kokkos/include/Kokkos_CopyViews.hpp
-- Installing: /opt/kokkos/include/traits
-- Installing: /opt/kokkos/include/traits/Kokkos_OccupancyControlTrait.hpp
-- Installing: /opt/kokkos/include/traits/Kokkos_IndexTypeTrait.hpp
-- Installing: /opt/kokkos/include/traits/Kokkos_IterationPatternTrait.hpp
-- Installing: /opt/kokkos/include/traits/Kokkos_Traits_fwd.hpp
-- Installing: /opt/kokkos/include/traits/Kokkos_WorkTagTrait.hpp
-- Installing: /opt/kokkos/include/traits/Kokkos_PolicyTraitAdaptor.hpp
-- Installing: /opt/kokkos/include/traits/Kokkos_LaunchBoundsTrait.hpp
-- Installing: /opt/kokkos/include/traits/Kokkos_ScheduleTrait.hpp
-- Installing: /opt/kokkos/include/traits/Kokkos_GraphKernelTrait.hpp
-- Installing: /opt/kokkos/include/traits/Kokkos_PolicyTraitMatcher.hpp
-- Installing: /opt/kokkos/include/traits/Kokkos_WorkItemPropertyTrait.hpp
-- Installing: /opt/kokkos/include/traits/Kokkos_ExecutionSpaceTrait.hpp
-- Installing: /opt/kokkos/include/KokkosExp_InterOp.hpp
-- Installing: /opt/kokkos/include/Kokkos_hwloc.hpp
-- Installing: /opt/kokkos/include/Kokkos_Macros.hpp
-- Installing: /opt/kokkos/include/fwd
-- Installing: /opt/kokkos/include/fwd/Kokkos_Fwd_OPENMP.hpp
-- Installing: /opt/kokkos/include/fwd/Kokkos_Fwd_OPENACC.hpp
-- Installing: /opt/kokkos/include/fwd/Kokkos_Fwd_OPENMPTARGET.hpp
-- Installing: /opt/kokkos/include/fwd/Kokkos_Fwd_HBWSpace.hpp
-- Installing: /opt/kokkos/include/fwd/Kokkos_Fwd_SYCL.hpp
-- Installing: /opt/kokkos/include/fwd/Kokkos_Fwd_THREADS.hpp
-- Installing: /opt/kokkos/include/fwd/Kokkos_Fwd_SERIAL.hpp
-- Installing: /opt/kokkos/include/fwd/Kokkos_Fwd_HPX.hpp
-- Installing: /opt/kokkos/include/fwd/Kokkos_Fwd_HIP.hpp
-- Installing: /opt/kokkos/include/fwd/Kokkos_Fwd_CUDA.hpp
-- Installing: /opt/kokkos/include/Kokkos_MathematicalFunctions.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_Parallel_Common.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_Parallel.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_Reducer.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_UniqueToken.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_ParallelScan_Team.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_MDRangePolicy.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_Parallel_MDRange.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTargetSpace.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_ParallelScan_Range.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_Task.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_ParallelReduce_Range.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_ParallelReduce_Team.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_ParallelFor_Team.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_Error.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_Instance.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_ParallelFor_Range.hpp
-- Installing: /opt/kokkos/include/OpenMPTarget/Kokkos_OpenMPTarget_Abort.hpp
-- Installing: /opt/kokkos/include/Kokkos_Parallel_Reduce.hpp
-- Installing: /opt/kokkos/include/Kokkos_Array.hpp
-- Installing: /opt/kokkos/include/Kokkos_MathematicalSpecialFunctions.hpp
-- Installing: /opt/kokkos/include/Kokkos_Graph_fwd.hpp
-- Installing: /opt/kokkos/include/OpenACC
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_SharedAllocationRecord.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_ParallelReduce_Range.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_DeepCopy.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_Traits.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_Team.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_ParallelReduce_Team.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_MDRangePolicy.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_ParallelFor_Range.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_ParallelFor_MDRange.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_Macros.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_ParallelFor_Team.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_Instance.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_ParallelScan_Range.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_ScheduleType.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_FunctorAdapter.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACCSpace.hpp
-- Installing: /opt/kokkos/include/OpenACC/Kokkos_OpenACC_ParallelReduce_MDRange.hpp
-- Installing: /opt/kokkos/include/decl
-- Installing: /opt/kokkos/include/decl/Kokkos_Declare_SYCL.hpp
-- Installing: /opt/kokkos/include/decl/Kokkos_Declare_OPENMP.hpp
-- Installing: /opt/kokkos/include/decl/Kokkos_Declare_HBWSpace.hpp
-- Installing: /opt/kokkos/include/decl/Kokkos_Declare_HIP.hpp
-- Installing: /opt/kokkos/include/decl/Kokkos_Declare_OPENACC.hpp
-- Installing: /opt/kokkos/include/decl/Kokkos_Declare_SERIAL.hpp
-- Installing: /opt/kokkos/include/decl/Kokkos_Declare_OPENMPTARGET.hpp
-- Installing: /opt/kokkos/include/decl/Kokkos_Declare_THREADS.hpp
-- Installing: /opt/kokkos/include/decl/Kokkos_Declare_CUDA.hpp
-- Installing: /opt/kokkos/include/decl/Kokkos_Declare_HPX.hpp
-- Installing: /opt/kokkos/include/Kokkos_MemoryPool.hpp
-- Installing: /opt/kokkos/include/setup
-- Installing: /opt/kokkos/include/setup/Kokkos_Setup_Cuda.hpp
-- Installing: /opt/kokkos/include/setup/Kokkos_Setup_SYCL.hpp
-- Installing: /opt/kokkos/include/setup/Kokkos_Setup_HIP.hpp
-- Installing: /opt/kokkos/include/Kokkos_WorkGraphPolicy.hpp
-- Installing: /opt/kokkos/include/Kokkos_MinMaxClamp.hpp
-- Installing: /opt/kokkos/include/Kokkos_Tuners.hpp
-- Installing: /opt/kokkos/include/Kokkos_HBWSpace.hpp
-- Installing: /opt/kokkos/include/Kokkos_NumericTraits.hpp
-- Installing: /opt/kokkos/include/Kokkos_MasterLock.hpp
-- Installing: /opt/kokkos/include/Kokkos_AcquireUniqueTokenImpl.hpp
-- Installing: /opt/kokkos/include/SYCL
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_Instance.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_MDRangePolicy.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_Half_Conversion.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_Parallel_Team.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_Space.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_ZeroMemset.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_Team.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_UniqueToken.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_Parallel_Range.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_Parallel_Reduce.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_DeepCopy.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_Abort.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_Half_Impl_Type.hpp
-- Installing: /opt/kokkos/include/SYCL/Kokkos_SYCL_Parallel_Scan.hpp
-- Installing: /opt/kokkos/include/Kokkos_Vectorization.hpp
-- Installing: /opt/kokkos/include/Kokkos_Future.hpp
-- Installing: /opt/kokkos/include/HIP
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_KernelLaunch.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_Parallel_Team.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_Space.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_ReduceScan.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_Error.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_Instance.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_Team.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_WorkGraphPolicy.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_Abort.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_UniqueToken.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_Vectorization.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_ZeroMemset.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_Half_Conversion.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_Shuffle_Reduce.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_Half_Impl_Type.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_Parallel_Range.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_Parallel_MDRange.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_BlockSize_Deduction.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_DeepCopy.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_SharedAllocationRecord.hpp
-- Installing: /opt/kokkos/include/HIP/Kokkos_HIP_MDRangePolicy.hpp
-- Installing: /opt/kokkos/include/impl
-- Installing: /opt/kokkos/include/impl/Kokkos_TaskPolicyData.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_AnalyzePolicy.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_GraphImpl_fwd.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Volatile_Load.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_MemorySpace.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_TaskQueueCommon.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Default_GraphNode_Impl.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_DeviceManagement.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_BitOps.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_ParseCommandLineArgumentsAndEnvironmentVariables.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_CPUDiscovery.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_StringManipulation.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_SharedAlloc_timpl.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_HostThreadTeam.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_HostSharedPtr.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Profiling_DeviceInfo.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_SimpleTaskScheduler.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_HostSpace_deepcopy.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Profiling_Interface.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_HostSpace_ZeroMemset.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_TaskNode.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_GraphImpl.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_ViewUniformType.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_TaskQueue_impl.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_QuadPrecisionMath.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_ChaseLev.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_GraphNodeCustomization.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_TaskQueueMemoryManager.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Command_Line_Parsing.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_TaskQueue.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Half_FloatingPointWrapper.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Default_Graph_fwd.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_TaskQueueMultiple_impl.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Spinwait.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Traits.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_ViewCtor.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_VLAEmulation.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_FixedBufferMemoryPool.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Profiling.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_ViewTracker.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Memory_Fence.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_HostBarrier.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_EBO.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Atomic_View.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_InitializationSettings.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_LinkedListNode.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_TaskResult.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Default_GraphNodeKernel.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_TaskBase.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_ZeroMemset_fwd.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_SingleTaskQueue.hpp
-- Installing: /opt/kokkos/include/impl/KokkosExp_ViewMapping.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_SharedAlloc.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Error.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_ClockTic.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_FunctorAnalysis.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Stacktrace.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_ViewMapping.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_TeamMDPolicy.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_ViewArray.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_LIFO.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Profiling_C_Interface.h
-- Installing: /opt/kokkos/include/impl/Kokkos_Half_NumericTraits.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_GraphNodeImpl.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_ExecSpaceManager.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_TaskQueueMultiple.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Tools_Generic.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_GraphImpl_Utilities.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Default_Graph_Impl.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_MultipleTaskQueue.hpp
-- Installing: /opt/kokkos/include/impl/KokkosExp_IterateTileGPU.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_ConcurrentBitset.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_NvidiaGpuArchitectures.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_ViewLayoutTiled.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_OptionalRef.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_TaskTeamMember.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Combined_Reducer.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Utilities.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Tools.hpp
-- Installing: /opt/kokkos/include/impl/KokkosExp_Host_IterateTile.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_MemoryPoolAllocator.hpp
-- Installing: /opt/kokkos/include/Kokkos_TaskScheduler_fwd.hpp
-- Installing: /opt/kokkos/include/OpenMP
-- Installing: /opt/kokkos/include/OpenMP/Kokkos_OpenMP_Team.hpp
-- Installing: /opt/kokkos/include/OpenMP/Kokkos_OpenMP_UniqueToken.hpp
-- Installing: /opt/kokkos/include/OpenMP/Kokkos_OpenMP_MDRangePolicy.hpp
-- Installing: /opt/kokkos/include/OpenMP/Kokkos_OpenMP.hpp
-- Installing: /opt/kokkos/include/OpenMP/Kokkos_OpenMP_Parallel.hpp
-- Installing: /opt/kokkos/include/OpenMP/Kokkos_OpenMP_Instance.hpp
-- Installing: /opt/kokkos/include/OpenMP/Kokkos_OpenMP_WorkGraphPolicy.hpp
-- Installing: /opt/kokkos/include/OpenMP/Kokkos_OpenMP_Task.hpp
-- Installing: /opt/kokkos/include/Cuda
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_ZeroMemset.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_Parallel_MDRange.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_Graph_Impl.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_Vectorization.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_CudaSpace.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_GraphNodeKernel.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_BlockSize_Deduction.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_GraphNode_Impl.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_KernelLaunch.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_UniqueToken.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_abort.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_WorkGraphPolicy.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_Half_Conversion.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_Parallel_Range.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_Instance.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_Error.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_Task.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_View.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_ReduceScan.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_Team.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_Half_Impl_Type.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_MDRangePolicy.hpp
-- Installing: /opt/kokkos/include/Cuda/Kokkos_Cuda_Parallel_Team.hpp
-- Installing: /opt/kokkos/include/Kokkos_Core_fwd.hpp
-- Installing: /opt/kokkos/include/Kokkos_Layout.hpp
-- Installing: /opt/kokkos/include/Kokkos_ExecPolicy.hpp
-- Installing: /opt/kokkos/include/Kokkos_Graph.hpp
-- Installing: /opt/kokkos/include/Kokkos_Pair.hpp
-- Installing: /opt/kokkos/include/Kokkos_Core.hpp
-- Installing: /opt/kokkos/include/View
-- Installing: /opt/kokkos/include/View/MDSpan
-- Installing: /opt/kokkos/include/View/MDSpan/Kokkos_MDSpan_Header.hpp
-- Installing: /opt/kokkos/include/View/MDSpan/Kokkos_MDSpan_Extents.hpp
-- Installing: /opt/kokkos/include/View/Hooks
-- Installing: /opt/kokkos/include/View/Hooks/Kokkos_ViewHooks.hpp
-- Installing: /opt/kokkos/include/desul
-- Installing: /opt/kokkos/include/desul/atomics
-- Installing: /opt/kokkos/include/desul/atomics/Compare_Exchange_SYCL.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Lock_Based_Fetch_Op_SYCL.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Lock_Based_Fetch_Op_Host.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Thread_Fence.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Adapt_GCC.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Thread_Fence_SYCL.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Thread_Fence_ScopeCaller.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Adapt_SYCL.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Atomic_Ref.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Compare_Exchange_OpenMP.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Fetch_Op_CUDA.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Adapt_CXX.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Fetch_Op.hpp
-- Installing: /opt/kokkos/include/desul/atomics/openmp
-- Installing: /opt/kokkos/include/desul/atomics/openmp/OpenMP_40.hpp
-- Installing: /opt/kokkos/include/desul/atomics/openmp/OpenMP_40_op.inc
-- Installing: /opt/kokkos/include/desul/atomics/Lock_Array_CUDA.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Fetch_Op_GCC.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Lock_Based_Fetch_Op.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Thread_Fence_GCC.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Generic.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Lock_Array_HIP.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Operator_Function_Objects.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Compare_Exchange_CUDA.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Compare_Exchange_GCC.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Lock_Based_Fetch_Op_CUDA.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Lock_Free_Fetch_Op.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Lock_Based_Fetch_Op_HIP.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Fetch_Op_Generic.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Thread_Fence_CUDA.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Common.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Compare_Exchange_ScopeCaller.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Compare_Exchange_MSVC.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Fetch_Op_ScopeCaller.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Lock_Array.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Fetch_Op_SYCL.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Fetch_Op_HIP.hpp
-- Installing: /opt/kokkos/include/desul/atomics/cuda
-- Installing: /opt/kokkos/include/desul/atomics/cuda/CUDA_asm.hpp
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_atomic_op.inc_predicate
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_exchange_op.inc
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_atomic_op.inc_generic
-- Installing: /opt/kokkos/include/desul/atomics/cuda/CUDA_asm_exchange.hpp
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_exchange_memorder.inc
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_memorder.inc
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_atomic_fetch_op.inc_generic
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_atomic_fetch_op.inc_predicate
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm.inc
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_atomic_op.inc
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_atomic_fetch_op.inc_forceglobal
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_atomic_op.inc_isglobal
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_exchange.inc
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_atomic_fetch_op.inc
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_atomic_op.inc_forceglobal
-- Installing: /opt/kokkos/include/desul/atomics/cuda/cuda_cc7_asm_atomic_fetch_op.inc_isglobal
-- Installing: /opt/kokkos/include/desul/atomics/Lock_Array_SYCL.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Thread_Fence_HIP.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Thread_Fence_MSVC.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Macros.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Fetch_Op_OpenMP.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Compare_Exchange.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Thread_Fence_OpenMP.hpp
-- Installing: /opt/kokkos/include/desul/atomics/Compare_Exchange_HIP.hpp
-- Installing: /opt/kokkos/include/desul/atomics.hpp
-- Up-to-date: /opt/kokkos/include/desul
-- Up-to-date: /opt/kokkos/include/desul/atomics
-- Installing: /opt/kokkos/include/desul/atomics/Config.hpp
-- Installing: /opt/kokkos/lib/libkokkoscore.a
-- Up-to-date: /opt/kokkos/lib/libkokkoscore.a
-- Up-to-date: /opt/kokkos/include
-- Installing: /opt/kokkos/include/Kokkos_OffsetView.hpp
-- Installing: /opt/kokkos/include/Kokkos_Functional.hpp
-- Installing: /opt/kokkos/include/Kokkos_Vector.hpp
-- Installing: /opt/kokkos/include/Kokkos_Bitset.hpp
-- Installing: /opt/kokkos/include/Kokkos_ErrorReporter.hpp
-- Installing: /opt/kokkos/include/Kokkos_DualView.hpp
-- Installing: /opt/kokkos/include/Kokkos_UnorderedMap.hpp
-- Installing: /opt/kokkos/include/Kokkos_ScatterView.hpp
-- Installing: /opt/kokkos/include/Kokkos_DynamicView.hpp
-- Up-to-date: /opt/kokkos/include/impl
-- Installing: /opt/kokkos/include/impl/Kokkos_Functional_impl.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_UnorderedMap_impl.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_Bitset_impl.hpp
-- Installing: /opt/kokkos/include/impl/Kokkos_StaticCrsGraph_factory.hpp
-- Installing: /opt/kokkos/include/Kokkos_StaticCrsGraph.hpp
-- Installing: /opt/kokkos/include/Kokkos_DynRankView.hpp
-- Installing: /opt/kokkos/lib/libkokkoscontainers.a
-- Up-to-date: /opt/kokkos/lib/libkokkoscontainers.a
-- Up-to-date: /opt/kokkos/include
-- Installing: /opt/kokkos/include/Kokkos_Random.hpp
-- Installing: /opt/kokkos/include/Kokkos_StdAlgorithms.hpp
-- Installing: /opt/kokkos/include/std_algorithms
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Count.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_ReverseCopy.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Rotate.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Fill.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_IsSorted.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_CountIf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_PartitionCopy.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_RemoveCopy.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_CopyIf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_MaxElement.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Reduce.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_IsSortedUntil.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_ReplaceCopy.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_BeginEnd.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_TransformInclusiveScan.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_RotateCopy.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Search.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Distance.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_IterSwap.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_FillN.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_AdjacentFind.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_FindEnd.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_FindIfNot.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_UniqueCopy.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Move.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_AnyOf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_AdjacentDifference.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_InclusiveScan.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_MinElement.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_ShiftRight.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Reverse.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_IsPartitioned.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Transform.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_ShiftLeft.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_ForEach.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_FindFirstOf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Remove.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_SwapRanges.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Equal.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Find.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_GenerateN.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_PartitionPoint.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_CopyBackward.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_LexicographicalCompare.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Unique.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Generate.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_TransformReduce.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Replace.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_TransformExclusiveScan.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_ReplaceCopyIf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Swap.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_RemoveCopyIf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_SearchN.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_ReplaceIf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_RemoveIf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_MinMaxElement.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Mismatch.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_ReverseCopy.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_ReducerWithArbitraryJoinerNoNeutralElement.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_Rotate.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_IsSorted.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_PartitionCopy.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_CopyIf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_FindIfOrNot.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_IdentityReferenceUnaryFunctor.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_CopyCopyN.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_Reduce.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_HelperPredicates.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_IsSortedUntil.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_ReplaceCopy.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_TransformInclusiveScan.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_RotateCopy.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_Constraints.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_Search.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_ForEachForEachN.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_AdjacentFind.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_FindEnd.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_UniqueCopy.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_Move.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_AdjacentDifference.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_InclusiveScan.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_ShiftRight.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_MinMaxMinmaxElement.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_Reverse.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_IsPartitioned.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_Transform.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_ShiftLeft.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_FindFirstOf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_SwapRanges.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_ValueWrapperForNoNeutralElement.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_Equal.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_GenerateGenerateN.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_PartitionPoint.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_CopyBackward.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_LexicographicalCompare.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_Unique.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_FillFillN.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_TransformReduce.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_Replace.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_TransformExclusiveScan.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_ReplaceCopyIf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_RemoveAllVariants.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_CountCountIf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_SearchN.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_ReplaceIf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_AllOfAnyOfNoneOf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_Mismatch.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_ExclusiveScan.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_MoveBackward.hpp
-- Installing: /opt/kokkos/include/std_algorithms/impl/Kokkos_RandomAccessIterator.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_AllOf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_Copy.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_CopyN.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_ExclusiveScan.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_MoveBackward.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_FindIf.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_ForEachN.hpp
-- Installing: /opt/kokkos/include/std_algorithms/Kokkos_NoneOf.hpp
-- Installing: /opt/kokkos/include/Kokkos_Sort.hpp
-- Installing: /opt/kokkos/include/Kokkos_NestedSort.hpp
-- Up-to-date: /opt/kokkos/include
-- Installing: /opt/kokkos/include/Kokkos_SIMD_Common.hpp
-- Installing: /opt/kokkos/include/Kokkos_SIMD_NEON.hpp
-- Installing: /opt/kokkos/include/Kokkos_SIMD_AVX2.hpp
-- Installing: /opt/kokkos/include/Kokkos_SIMD.hpp
-- Installing: /opt/kokkos/include/Kokkos_SIMD_Scalar.hpp
-- Installing: /opt/kokkos/include/Kokkos_SIMD_AVX512.hpp
-- Installing: /opt/kokkos/lib/libkokkossimd.a
-- Up-to-date: /opt/kokkos/lib/libkokkossimd.a
-- Installing: /opt/kokkos/lib/cmake/Kokkos/KokkosConfig.cmake
-- Installing: /opt/kokkos/lib/cmake/Kokkos/KokkosConfigCommon.cmake
-- Installing: /opt/kokkos/lib/cmake/Kokkos/KokkosConfigVersion.cmake
-- Installing: /opt/kokkos/lib/cmake/Kokkos/KokkosTargets.cmake
-- Installing: /opt/kokkos/lib/cmake/Kokkos/KokkosTargets-release.cmake
-- Installing: /opt/kokkos/include/KokkosCore_config.h
-- Installing: /opt/kokkos/bin/nvcc_wrapper
-- Installing: /opt/kokkos/bin/hpcbind
-- Installing: /opt/kokkos/bin/kokkos_launch_compiler
-- Up-to-date: /opt/kokkos/include/KokkosCore_config.h
-- Installing: /opt/kokkos/include/KokkosCore_Config_FwdBackend.hpp
-- Installing: /opt/kokkos/include/KokkosCore_Config_SetupBackend.hpp
-- Installing: /opt/kokkos/include/KokkosCore_Config_DeclareBackend.hpp
-- Installing: /opt/kokkos/include/KokkosCore_Config_PostInclude.hpp
[ 47%] Built target Grid_SparseIndexSpace_MPI_test_CUDA
Scanning dependencies of target Grid_Partitioner_MPI_test_CUDA_UVM
[ 48%] Building CXX object grid/unit_test/CMakeFiles/Grid_Partitioner_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstPartitioner_CUDA_UVM.cpp.o
[ 48%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseIndexSpace_MPI_test_SERIAL.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 48%] Linking CXX executable Grid_SparseIndexSpace_MPI_test_SERIAL
[ 48%] Built target Grid_SparseIndexSpace_MPI_test_SERIAL
Scanning dependencies of target Grid_Array2d_MPI_test_SERIAL
[ 48%] Building CXX object grid/unit_test/CMakeFiles/Grid_Array2d_MPI_test_SERIAL.dir/SERIAL/tstArray2d_SERIAL.cpp.o
[ 48%] Linking CXX executable Grid_Parallel_MPI_test_CUDA_UVM
[ 48%] Built target Grid_Parallel_MPI_test_CUDA_UVM
[ 48%] Building CXX object grid/unit_test/CMakeFiles/Grid_Partitioner_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 49%] Building CXX object grid/unit_test/CMakeFiles/Grid_Halo3d_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 49%] Building CXX object grid/unit_test/CMakeFiles/Grid_Array2d_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 49%] Linking CXX executable Grid_Halo3d_MPI_test_SERIAL
[ 49%] Building CXX object grid/unit_test/CMakeFiles/Grid_ParticleInit_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
Scanning dependencies of target Grid_SparseHalo_MPI_test_CUDA_UVM
[ 50%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseHalo_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstSparseHalo_CUDA_UVM.cpp.o
[ 50%] Building CXX object grid/unit_test/CMakeFiles/Grid_Array3d_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 50%] Linking CXX executable Grid_Array3d_MPI_test_SERIAL
[ 50%] Building CXX object grid/unit_test/CMakeFiles/Grid_Splines_MPI_test_SERIAL.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 50%] Linking CXX executable Grid_Splines_MPI_test_SERIAL
[ 50%] Linking CXX executable Grid_Partitioner_MPI_test_CUDA_UVM
[ 50%] Built target Grid_Halo3d_MPI_test_SERIAL
Scanning dependencies of target Grid_LocalGrid_MPI_test_SERIAL
[ 51%] Building CXX object grid/unit_test/CMakeFiles/Grid_LocalGrid_MPI_test_SERIAL.dir/SERIAL/tstLocalGrid_SERIAL.cpp.o
[ 51%] Built target Grid_Array3d_MPI_test_SERIAL
Scanning dependencies of target Grid_IndexSpace_MPI_test_SERIAL
[ 52%] Building CXX object grid/unit_test/CMakeFiles/Grid_IndexSpace_MPI_test_SERIAL.dir/SERIAL/tstIndexSpace_SERIAL.cpp.o
[ 52%] Built target Grid_Splines_MPI_test_SERIAL
[ 52%] Built target Grid_Partitioner_MPI_test_CUDA_UVM
[ 52%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseHalo_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
Scanning dependencies of target Grid_GlobalGrid_MPI_test_SERIAL
[ 53%] Building CXX object grid/unit_test/CMakeFiles/Grid_GlobalGrid_MPI_test_SERIAL.dir/SERIAL/tstGlobalGrid_SERIAL.cpp.o
[ 53%] Building CXX object grid/unit_test/CMakeFiles/Grid_GlobalGrid_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
Scanning dependencies of target Grid_HypreSemiStructuredSolverMulti_MPI_test_CUDA
[ 53%] Building CXX object grid/unit_test/CMakeFiles/Grid_HypreSemiStructuredSolverMulti_MPI_test_CUDA.dir/CUDA/tstHypreSemiStructuredSolverMulti_CUDA.cpp.o
[ 54%] Building CXX object grid/unit_test/CMakeFiles/Grid_ParticleList_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 54%] Linking CXX executable Grid_ParticleList_MPI_test_SERIAL
[ 54%] Linking CXX executable Grid_ParticleInit_MPI_test_CUDA
[ 54%] Built target Grid_ParticleInit_MPI_test_CUDA
[ 54%] Built target Grid_ParticleList_MPI_test_SERIAL
Scanning dependencies of target Grid_ParticleGridDistributor3d_MPI_test_CUDA_UVM
Scanning dependencies of target Grid_HypreSemiStructuredSolverMulti_MPI_test_SERIAL
[ 54%] Building CXX object grid/unit_test/CMakeFiles/Grid_ParticleGridDistributor3d_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstParticleGridDistributor3d_CUDA_UVM.cpp.o
[ 54%] Building CXX object grid/unit_test/CMakeFiles/Grid_HypreSemiStructuredSolverMulti_MPI_test_SERIAL.dir/SERIAL/tstHypreSemiStructuredSolverMulti_SERIAL.cpp.o
[ 54%] Linking CXX executable Grid_Array2d_MPI_test_SERIAL
[ 54%] Built target Grid_Array2d_MPI_test_SERIAL
[ 54%] Building CXX object grid/unit_test/CMakeFiles/Grid_HypreSemiStructuredSolverMulti_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
Scanning dependencies of target Grid_SparseArray_MPI_test_SERIAL
 ---> Removed intermediate container ede8752558ff
 ---> fb83b280eb11
Successfully built fb83b280eb11
Successfully tagged f1543b7005e5744b73e10444c0db9dc00eeab29c:latest
[Pipeline] isUnix
[Pipeline] withEnv
[Pipeline] {
[Pipeline] sh
[ 54%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseArray_MPI_test_SERIAL.dir/SERIAL/tstSparseArray_SERIAL.cpp.o
+ docker inspect -f . f1543b7005e5744b73e10444c0db9dc00eeab29c
.
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] withDockerContainer
fetnat04 seems to be running inside container 3b6db9f76a3764e23425e248da644f2a4c1c0ee72b5ac0d428184db7e80c1f8e
$ docker run -t -d -u 0:0 -v /tmp/ccache.kokkos:/tmp/ccache -w /var/jenkins/workspace/Cabana_PR-743 --volumes-from 3b6db9f76a3764e23425e248da644f2a4c1c0ee72b5ac0d428184db7e80c1f8e -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** f1543b7005e5744b73e10444c0db9dc00eeab29c cat
[ 54%] Linking CXX executable Grid_GlobalGrid_MPI_test_SERIAL
[ 54%] Building CXX object grid/unit_test/CMakeFiles/Grid_LocalGrid_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 54%] Linking CXX executable Grid_LocalGrid_MPI_test_SERIAL
$ docker top f324e08565c97b3d73dbd20ee34ab1968400b06524f911e22f6a6c6b4b1c44d3 -eo pid,comm
[Pipeline] {
[Pipeline] sh
+ ccache --zero-stats
Statistics zeroed
[Pipeline] sh
[ 54%] Building CXX object grid/unit_test/CMakeFiles/Grid_IndexSpace_MPI_test_SERIAL.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 54%] Linking CXX executable Grid_IndexSpace_MPI_test_SERIAL
[ 54%] Built target Grid_GlobalGrid_MPI_test_SERIAL
+ . /opt/intel/oneapi/setvars.sh --include-intel-llvm
+ script_name=setvars.sh
+ config_file=
+ config_array=
+ component_array=
+ warning_tally=0
+ posix_nl=

+ save_args
+ echo  
+ script_args= 
+ _setvars_this_script_name=setvars.sh
+ _setvars_get_proc_name /var/jenkins/workspace/Cabana_PR-743@tmp/durable-c1f7d414/script.sh.copy
+ [ -n  ]
+ script=/var/jenkins/workspace/Cabana_PR-743@tmp/durable-c1f7d414/script.sh.copy
+ [ -L /var/jenkins/workspace/Cabana_PR-743@tmp/durable-c1f7d414/script.sh.copy ]
+ basename -- /var/jenkins/workspace/Cabana_PR-743@tmp/durable-c1f7d414/script.sh.copy
+ [ setvars.sh = script.sh.copy ]
+ sourcer=
+ sourced_nm=
+ ps -p 51 -o comm=
+ sourced_sh=sh
+ _setvars_get_proc_name /var/jenkins/workspace/Cabana_PR-743@tmp/durable-c1f7d414/script.sh.copy
+ [ -n  ]
+ script=/var/jenkins/workspace/Cabana_PR-743@tmp/durable-c1f7d414/script.sh.copy
+ [ -L /var/jenkins/workspace/Cabana_PR-743@tmp/durable-c1f7d414/script.sh.copy ]
+ basename -- /var/jenkins/workspace/Cabana_PR-743@tmp/durable-c1f7d414/script.sh.copy
+ proc_name=script.sh.copy
+ [ -n  ]
+ [ -n  ]
+ [ -n  ]
+ [ dash = sh ]
+ [ sh = sh ]
+ printf %s: %s script.sh.copy SH_VERSION = unknown
+ sourcer=script.sh.copy: SH_VERSION = unknown
+ sourced_nm=/var/jenkins/workspace/Cabana_PR-743@tmp/durable-c1f7d414/script.sh.copy: 370: /opt/intel/oneapi/setvars.sh: Bad substitution
+ :
+ printf %s /var/jenkins/workspace/Cabana_PR-743@tmp/durable-c1f7d414/script.sh.copy: 370: /opt/intel/oneapi/setvars.sh: Bad substitution
+ grep -Eq sh: [0-9]+: .*setvars\.sh: 
+ [  -eq 0 ]
/var/jenkins/workspace/Cabana_PR-743@tmp/durable-c1f7d414/script.sh.copy: 371: [: Illegal number: 
+ [  = /var/jenkins/workspace/Cabana_PR-743@tmp/durable-c1f7d414/script.sh.copy: 370: /opt/intel/oneapi/setvars.sh: Bad substitution ]
+ get_script_path /var/jenkins/workspace/Cabana_PR-743@tmp/durable-c1f7d414/script.sh.copy: 370: /opt/intel/oneapi/setvars.sh: Bad substitution
+ script=/var/jenkins/workspace/Cabana_PR-743@tmp/durable-c1f7d414/script.sh.copy: 370: /opt/intel/oneapi/setvars.sh: Bad substitution
+ [ -L /var/jenkins/workspace/Cabana_PR-743@tmp/durable-c1f7d414/script.sh.copy: 370: /opt/intel/oneapi/setvars.sh: Bad substitution ]
+ command dirname -- /var/jenkins/workspace/Cabana_PR-743@tmp/durable-c1f7d414/script.sh.copy: 370: /opt/intel/oneapi/setvars.sh: Bad substitution
+ script_dir=/var/jenkins/workspace/Cabana_PR-743@tmp/durable-c1f7d414/script.sh.copy: 370: /opt/intel/oneapi
+ [ -n  ]
+ command cd /var/jenkins/workspace/Cabana_PR-743@tmp/durable-c1f7d414/script.sh.copy: 370: /opt/intel/oneapi
/var/jenkins/workspace/Cabana_PR-743@tmp/durable-c1f7d414/script.sh.copy: 269: cd: can't cd to /var/jenkins/workspace/Cabana_PR-743@tmp/durable-c1f7d414/script.sh.copy: 370: /opt/intel/oneapi
+ script_dir=
+ script_root=
Post stage
[Pipeline] sh
[ 54%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseArray_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
+ ccache --show-stats
cache directory                     /tmp/ccache
primary config                      /tmp/ccache/ccache.conf
secondary config      (readonly)    /etc/ccache.conf
stats updated                       Thu Mar 21 19:59:40 2024
stats zeroed                        Thu Mar 21 19:59:40 2024
cache hit (direct)                     0
cache hit (preprocessed)               0
cache miss                             0
cache hit rate                      0.00 %
cleanups performed                     0
files in cache                      2514
cache size                           4.3 GB
max cache size                      10.0 GB
[Pipeline] }
$ docker stop --time=1 f324e08565c97b3d73dbd20ee34ab1968400b06524f911e22f6a6c6b4b1c44d3
Scanning dependencies of target Grid_HypreStructuredSolver3d_MPI_test_SERIAL
[ 54%] Building CXX object grid/unit_test/CMakeFiles/Grid_HypreStructuredSolver3d_MPI_test_SERIAL.dir/SERIAL/tstHypreStructuredSolver3d_SERIAL.cpp.o
$ docker rm -f --volumes f324e08565c97b3d73dbd20ee34ab1968400b06524f911e22f6a6c6b4b1c44d3
[ 54%] Building CXX object grid/unit_test/CMakeFiles/Grid_HypreSemiStructuredSolverMulti_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 54%] Built target Grid_LocalGrid_MPI_test_SERIAL
[Pipeline] // withDockerContainer
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Failed in branch SYCL
Scanning dependencies of target Grid_GlobalParticleComm_MPI_test_SERIAL
[ 54%] Linking CXX executable Grid_HypreSemiStructuredSolverMulti_MPI_test_SERIAL
[ 54%] Building CXX object grid/unit_test/CMakeFiles/Grid_GlobalParticleComm_MPI_test_SERIAL.dir/SERIAL/tstGlobalParticleComm_SERIAL.cpp.o
[ 54%] Built target Grid_IndexSpace_MPI_test_SERIAL
Scanning dependencies of target Grid_LocalMesh2d_MPI_test_SERIAL
[ 54%] Building CXX object grid/unit_test/CMakeFiles/Grid_LocalMesh2d_MPI_test_SERIAL.dir/SERIAL/tstLocalMesh2d_SERIAL.cpp.o
[ 54%] Built target Grid_HypreSemiStructuredSolverMulti_MPI_test_SERIAL
Scanning dependencies of target Grid_SparseIndexSpace_MPI_test_CUDA_UVM
[ 54%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseIndexSpace_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstSparseIndexSpace_CUDA_UVM.cpp.o
/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseHalo.hpp(601): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseHalo.hpp(601): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<double [3], float > ,  ::Kokkos::CudaUVMSpace,  ::Cabana::Grid::Node,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Cuda, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<double [3], float > ,  ::Kokkos::CudaUVMSpace,  ::Cabana::Grid::Node,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Cuda, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray [subobject]") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseHalo.hpp(922): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseHalo.hpp(922): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseHalo.hpp(1184): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseHalo.hpp(1184): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseHalo.hpp(1184): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseHalo.hpp(1184): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("") is not allowed
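[editor's note] The repeated nvcc diagnostics above all stem from the same pattern: a std::shared_ptr is touched from __host__ __device__ code, and the shared_ptr control block (std::__shared_count) only provides host constructors/destructors. The empty function name "" in some of the warnings typically denotes a __host__ __device__ lambda. A minimal sketch that reproduces the same warning under nvcc (hypothetical code, not taken from Cabana):

    #include <memory>

    // A class whose destructor is marked __host__ __device__ but whose member
    // is a std::shared_ptr: destroying the member calls the host-only
    // std::__shared_count destructor, which is exactly what nvcc warns about.
    class SparseArrayLike
    {
      public:
        __host__ __device__ ~SparseArrayLike() {}

      private:
        std::shared_ptr<int> _handle; // ref-count teardown is host-only
    };

    int main()
    {
        SparseArrayLike a; // nvcc emits the warning when compiling the
                           // destructor; the code is fine as long as the
                           // destructor only ever runs on the host
        return 0;
    }
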

[ 54%] Linking CXX executable Grid_SparseHalo_MPI_test_CUDA_UVM
[ 54%] Linking CXX executable Grid_HypreSemiStructuredSolverMulti_MPI_test_CUDA
[ 54%] Built target Grid_SparseHalo_MPI_test_CUDA_UVM
[ 55%] Building CXX object grid/unit_test/CMakeFiles/Grid_ParticleGridDistributor3d_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
Scanning dependencies of target Grid_BovWriter_MPI_test_SERIAL
[ 55%] Linking CXX executable Grid_ParticleGridDistributor3d_MPI_test_CUDA_UVM
[ 55%] Building CXX object grid/unit_test/CMakeFiles/Grid_BovWriter_MPI_test_SERIAL.dir/SERIAL/tstBovWriter_SERIAL.cpp.o
[ 55%] Building CXX object grid/unit_test/CMakeFiles/Grid_HypreStructuredSolver3d_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 56%] Linking CXX executable Grid_HypreStructuredSolver3d_MPI_test_SERIAL
[ 56%] Built target Grid_HypreSemiStructuredSolverMulti_MPI_test_CUDA
[ 56%] Built target Grid_ParticleGridDistributor3d_MPI_test_CUDA_UVM
Scanning dependencies of target Grid_ParticleList_MPI_test_CUDA_UVM
Scanning dependencies of target Grid_Halo2d_MPI_test_SERIAL
[ 56%] Built target Grid_HypreStructuredSolver3d_MPI_test_SERIAL
[ 56%] Building CXX object grid/unit_test/CMakeFiles/Grid_ParticleList_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstParticleList_CUDA_UVM.cpp.o
[ 56%] Building CXX object grid/unit_test/CMakeFiles/Grid_Halo2d_MPI_test_SERIAL.dir/SERIAL/tstHalo2d_SERIAL.cpp.o
Scanning dependencies of target Grid_SparseLocalGrid_MPI_test_SERIAL
[ 57%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseLocalGrid_MPI_test_SERIAL.dir/SERIAL/tstSparseLocalGrid_SERIAL.cpp.o
/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseArray.hpp(367): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseArray.hpp(367): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseArray.hpp(367): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseArray.hpp(367): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseArray.hpp(554): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseArray.hpp(554): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseArray.hpp(554): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseArray.hpp(554): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<int [3], float, double [3], int [2] > ,  ::Kokkos::HostSpace,  ::Cabana::Grid::Node,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Serial, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<int [3], float, double [3], int [2] > ,  ::Kokkos::HostSpace,  ::Cabana::Grid::Node,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Serial, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray [subobject]") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<int [3], float, double [3], int [2] > ,  ::Kokkos::HostSpace,  ::Cabana::Grid::Cell,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Serial, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<int [3], float, double [3], int [2] > ,  ::Kokkos::HostSpace,  ::Cabana::Grid::Cell,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Serial, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray [subobject]") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<int [3], float [3], int [2] > ,  ::Kokkos::HostSpace,  ::Cabana::Grid::Node,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Serial, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<int [3], float [3], int [2] > ,  ::Kokkos::HostSpace,  ::Cabana::Grid::Node,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Serial, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray [subobject]") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<int [3], float [3], int [2] > ,  ::Kokkos::HostSpace,  ::Cabana::Grid::Cell,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Serial, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<int [3], float [3], int [2] > ,  ::Kokkos::HostSpace,  ::Cabana::Grid::Cell,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Serial, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray [subobject]") is not allowed
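[editor's note] These warnings are benign while the destructors in question only run on the host, but they are noisy. One common way to avoid the pattern altogether (a sketch of the general technique, not a change from this PR, assuming the state only needs to be device-visible rather than shared-owned) is to hold it in a Kokkos::View, whose copy constructor and destructor are host/device-safe and whose reference counting is disabled inside parallel regions:

    #include <Kokkos_Core.hpp>

    void fill_ones( int n )
    {
        Kokkos::View<int*> data( "data", n );

        // Capturing the View by value in the __host__ __device__ lambda is
        // safe: no host-only shared_ptr machinery is pulled into device code.
        Kokkos::parallel_for(
            "fill_ones", n, KOKKOS_LAMBDA( const int i ) { data( i ) = 1; } );
    }

    int main( int argc, char* argv[] )
    {
        Kokkos::initialize( argc, argv );
        fill_ones( 100 );
        Kokkos::finalize();
        return 0;
    }
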

[ 57%] Linking CXX executable Grid_SparseArray_MPI_test_SERIAL
[ 57%] Building CXX object grid/unit_test/CMakeFiles/Grid_LocalMesh2d_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 57%] Built target Grid_SparseArray_MPI_test_SERIAL
[ 58%] Linking CXX executable Grid_LocalMesh2d_MPI_test_SERIAL
Scanning dependencies of target Grid_GlobalParticleComm_MPI_test_CUDA_UVM
[ 58%] Building CXX object grid/unit_test/CMakeFiles/Grid_GlobalParticleComm_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstGlobalParticleComm_CUDA_UVM.cpp.o
[ 58%] Building CXX object grid/unit_test/CMakeFiles/Grid_GlobalParticleComm_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 58%] Linking CXX executable Grid_GlobalParticleComm_MPI_test_SERIAL
[ 58%] Built target Grid_LocalMesh2d_MPI_test_SERIAL
Scanning dependencies of target Grid_LocalMesh3d_MPI_test_CUDA
[ 58%] Built target Grid_GlobalParticleComm_MPI_test_SERIAL
[ 58%] Building CXX object grid/unit_test/CMakeFiles/Grid_LocalMesh3d_MPI_test_CUDA.dir/CUDA/tstLocalMesh3d_CUDA.cpp.o
Scanning dependencies of target Grid_Array2d_MPI_test_CUDA
[ 58%] Building CXX object grid/unit_test/CMakeFiles/Grid_Array2d_MPI_test_CUDA.dir/CUDA/tstArray2d_CUDA.cpp.o
[ 58%] Building CXX object grid/unit_test/CMakeFiles/Grid_BovWriter_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 59%] Linking CXX executable Grid_BovWriter_MPI_test_SERIAL
[ 59%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseLocalGrid_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 59%] Linking CXX executable Grid_SparseLocalGrid_MPI_test_SERIAL
[ 59%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseIndexSpace_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 60%] Linking CXX executable Grid_SparseIndexSpace_MPI_test_CUDA_UVM
[ 60%] Built target Grid_BovWriter_MPI_test_SERIAL
Scanning dependencies of target Grid_Halo3d_MPI_test_CUDA
[ 60%] Building CXX object grid/unit_test/CMakeFiles/Grid_Halo3d_MPI_test_CUDA.dir/CUDA/tstHalo3d_CUDA.cpp.o
[ 60%] Built target Grid_SparseLocalGrid_MPI_test_SERIAL
Scanning dependencies of target Grid_Halo2d_MPI_test_CUDA
[ 60%] Building CXX object grid/unit_test/CMakeFiles/Grid_Halo2d_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 61%] Building CXX object grid/unit_test/CMakeFiles/Grid_Halo2d_MPI_test_CUDA.dir/CUDA/tstHalo2d_CUDA.cpp.o
[ 61%] Linking CXX executable Grid_Halo2d_MPI_test_SERIAL
[ 61%] Built target Grid_SparseIndexSpace_MPI_test_CUDA_UVM
Scanning dependencies of target Grid_IndexSpace_MPI_test_CUDA
[ 61%] Building CXX object grid/unit_test/CMakeFiles/Grid_IndexSpace_MPI_test_CUDA.dir/CUDA/tstIndexSpace_CUDA.cpp.o
[ 61%] Built target Grid_Halo2d_MPI_test_SERIAL
Scanning dependencies of target Grid_Interpolation2d_MPI_test_CUDA
[ 61%] Building CXX object grid/unit_test/CMakeFiles/Grid_Interpolation2d_MPI_test_CUDA.dir/CUDA/tstInterpolation2d_CUDA.cpp.o
[ 61%] Building CXX object grid/unit_test/CMakeFiles/Grid_ParticleList_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 61%] Linking CXX executable Grid_ParticleList_MPI_test_CUDA_UVM
[ 61%] Built target Grid_ParticleList_MPI_test_CUDA_UVM
Scanning dependencies of target Grid_FastFourierTransform_MPI_test_CUDA
[ 61%] Building CXX object grid/unit_test/CMakeFiles/Grid_FastFourierTransform_MPI_test_CUDA.dir/CUDA/tstFastFourierTransform_CUDA.cpp.o
[ 61%] Building CXX object grid/unit_test/CMakeFiles/Grid_LocalMesh3d_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 61%] Linking CXX executable Grid_LocalMesh3d_MPI_test_CUDA
[ 61%] Building CXX object grid/unit_test/CMakeFiles/Grid_Array2d_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 61%] Linking CXX executable Grid_Array2d_MPI_test_CUDA
[ 61%] Built target Grid_LocalMesh3d_MPI_test_CUDA
[ 61%] Building CXX object grid/unit_test/CMakeFiles/Grid_FastFourierTransform_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 62%] Building CXX object grid/unit_test/CMakeFiles/Grid_IndexSpace_MPI_test_CUDA.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 62%] Building CXX object grid/unit_test/CMakeFiles/Grid_Interpolation2d_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
Scanning dependencies of target Grid_ParticleGridDistributor3d_MPI_test_CUDA
[ 63%] Building CXX object grid/unit_test/CMakeFiles/Grid_GlobalParticleComm_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 63%] Building CXX object grid/unit_test/CMakeFiles/Grid_ParticleGridDistributor3d_MPI_test_CUDA.dir/CUDA/tstParticleGridDistributor3d_CUDA.cpp.o
[ 63%] Linking CXX executable Grid_GlobalParticleComm_MPI_test_CUDA_UVM
[ 63%] Built target Grid_Array2d_MPI_test_CUDA
Scanning dependencies of target Grid_SplineEvaluation3d_MPI_test_CUDA
[ 63%] Building CXX object grid/unit_test/CMakeFiles/Grid_SplineEvaluation3d_MPI_test_CUDA.dir/CUDA/tstSplineEvaluation3d_CUDA.cpp.o
[ 63%] Built target Grid_GlobalParticleComm_MPI_test_CUDA_UVM
Scanning dependencies of target Grid_SplineEvaluation2d_MPI_test_CUDA
[ 63%] Building CXX object grid/unit_test/CMakeFiles/Grid_SplineEvaluation2d_MPI_test_CUDA.dir/CUDA/tstSplineEvaluation2d_CUDA.cpp.o
[ 63%] Building CXX object grid/unit_test/CMakeFiles/Grid_Halo3d_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 63%] Building CXX object grid/unit_test/CMakeFiles/Grid_Halo2d_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 63%] Linking CXX executable Grid_Halo2d_MPI_test_CUDA
[ 64%] Linking CXX executable Grid_Halo3d_MPI_test_CUDA
[ 64%] Linking CXX executable Grid_IndexSpace_MPI_test_CUDA
[ 64%] Built target Grid_Halo3d_MPI_test_CUDA
[ 64%] Building CXX object grid/unit_test/CMakeFiles/Grid_SplineEvaluation3d_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 64%] Built target Grid_Halo2d_MPI_test_CUDA
[ 64%] Building CXX object grid/unit_test/CMakeFiles/Grid_ParticleGridDistributor3d_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
Scanning dependencies of target Grid_Interpolation3d_MPI_test_CUDA
[ 64%] Building CXX object grid/unit_test/CMakeFiles/Grid_Interpolation3d_MPI_test_CUDA.dir/CUDA/tstInterpolation3d_CUDA.cpp.o
[ 64%] Built target Grid_IndexSpace_MPI_test_CUDA
Scanning dependencies of target Grid_HypreStructuredSolver2d_MPI_test_CUDA_UVM
[ 64%] Building CXX object grid/unit_test/CMakeFiles/Grid_HypreStructuredSolver2d_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstHypreStructuredSolver2d_CUDA_UVM.cpp.o
Scanning dependencies of target Grid_BovWriter_MPI_test_CUDA
[ 64%] Building CXX object grid/unit_test/CMakeFiles/Grid_BovWriter_MPI_test_CUDA.dir/CUDA/tstBovWriter_CUDA.cpp.o
[ 64%] Linking CXX executable Grid_Interpolation2d_MPI_test_CUDA
[ 64%] Built target Grid_Interpolation2d_MPI_test_CUDA
[ 64%] Building CXX object grid/unit_test/CMakeFiles/Grid_HypreStructuredSolver2d_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
Scanning dependencies of target Grid_Partitioner_MPI_test_CUDA
[ 64%] Building CXX object grid/unit_test/CMakeFiles/Grid_Partitioner_MPI_test_CUDA.dir/CUDA/tstPartitioner_CUDA.cpp.o
[ 64%] Linking CXX executable Grid_SplineEvaluation3d_MPI_test_CUDA
[ 64%] Linking CXX executable Grid_FastFourierTransform_MPI_test_CUDA
[ 64%] Linking CXX executable Grid_ParticleGridDistributor3d_MPI_test_CUDA
[ 64%] Linking CXX executable Grid_HypreStructuredSolver2d_MPI_test_CUDA_UVM
[ 64%] Built target Grid_SplineEvaluation3d_MPI_test_CUDA
Scanning dependencies of target Grid_ParticleList_MPI_test_CUDA
[ 64%] Building CXX object grid/unit_test/CMakeFiles/Grid_SplineEvaluation2d_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 64%] Building CXX object grid/unit_test/CMakeFiles/Grid_ParticleList_MPI_test_CUDA.dir/CUDA/tstParticleList_CUDA.cpp.o
[ 64%] Linking CXX executable Grid_SplineEvaluation2d_MPI_test_CUDA
[ 64%] Built target Grid_FastFourierTransform_MPI_test_CUDA
Scanning dependencies of target Grid_GlobalParticleComm_MPI_test_CUDA
[ 64%] Building CXX object grid/unit_test/CMakeFiles/Grid_GlobalParticleComm_MPI_test_CUDA.dir/CUDA/tstGlobalParticleComm_CUDA.cpp.o
[ 64%] Built target Grid_ParticleGridDistributor3d_MPI_test_CUDA
[ 64%] Built target Grid_HypreStructuredSolver2d_MPI_test_CUDA_UVM
Scanning dependencies of target Grid_SparseArray_MPI_test_CUDA
[ 65%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseArray_MPI_test_CUDA.dir/CUDA/tstSparseArray_CUDA.cpp.o
Scanning dependencies of target Grid_SparseDimPartitioner_MPI_test_CUDA
[ 65%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseDimPartitioner_MPI_test_CUDA.dir/CUDA/tstSparseDimPartitioner_CUDA.cpp.o
[ 65%] Built target Grid_SplineEvaluation2d_MPI_test_CUDA
[ 65%] Building CXX object grid/unit_test/CMakeFiles/Grid_ParticleList_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
Scanning dependencies of target Grid_Array2d_MPI_test_CUDA_UVM
[ 65%] Building CXX object grid/unit_test/CMakeFiles/Grid_Array2d_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstArray2d_CUDA_UVM.cpp.o
[ 65%] Building CXX object grid/unit_test/CMakeFiles/Grid_Partitioner_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 65%] Linking CXX executable Grid_Partitioner_MPI_test_CUDA
[ 65%] Building CXX object grid/unit_test/CMakeFiles/Grid_BovWriter_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 66%] Linking CXX executable Grid_BovWriter_MPI_test_CUDA
[ 66%] Built target Grid_Partitioner_MPI_test_CUDA
Scanning dependencies of target Grid_LocalGrid_MPI_test_CUDA
[ 66%] Building CXX object grid/unit_test/CMakeFiles/Grid_LocalGrid_MPI_test_CUDA.dir/CUDA/tstLocalGrid_CUDA.cpp.o
[ 66%] Built target Grid_BovWriter_MPI_test_CUDA
[ 67%] Building CXX object grid/unit_test/CMakeFiles/Grid_LocalGrid_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 67%] Building CXX object grid/unit_test/CMakeFiles/Grid_Interpolation3d_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 67%] Building CXX object grid/unit_test/CMakeFiles/Grid_GlobalParticleComm_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 67%] Linking CXX executable Grid_Interpolation3d_MPI_test_CUDA
Scanning dependencies of target Grid_SparseHalo_MPI_test_CUDA
[ 67%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseHalo_MPI_test_CUDA.dir/CUDA/tstSparseHalo_CUDA.cpp.o
[ 67%] Built target Grid_Interpolation3d_MPI_test_CUDA
Scanning dependencies of target Grid_Array3d_MPI_test_CUDA_UVM
[ 67%] Building CXX object grid/unit_test/CMakeFiles/Grid_Array3d_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstArray3d_CUDA_UVM.cpp.o
[ 68%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseDimPartitioner_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 69%] Linking CXX executable Grid_ParticleList_MPI_test_CUDA
[ 69%] Linking CXX executable Grid_SparseDimPartitioner_MPI_test_CUDA
[ 70%] Building CXX object grid/unit_test/CMakeFiles/Grid_Array2d_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 70%] Linking CXX executable Grid_Array2d_MPI_test_CUDA_UVM
[ 70%] Built target Grid_ParticleList_MPI_test_CUDA
[ 70%] Linking CXX executable Grid_LocalGrid_MPI_test_CUDA
[ 70%] Built target Grid_SparseDimPartitioner_MPI_test_CUDA
Scanning dependencies of target Grid_IndexSpace_MPI_test_CUDA_UVM
[ 70%] Building CXX object grid/unit_test/CMakeFiles/Grid_IndexSpace_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstIndexSpace_CUDA_UVM.cpp.o
[ 70%] Built target Grid_Array2d_MPI_test_CUDA_UVM
Scanning dependencies of target Grid_HypreStructuredSolver3d_MPI_test_CUDA
[ 70%] Building CXX object grid/unit_test/CMakeFiles/Grid_HypreStructuredSolver3d_MPI_test_CUDA.dir/CUDA/tstHypreStructuredSolver3d_CUDA.cpp.o
Scanning dependencies of target Grid_Parallel_MPI_test_SERIAL
[ 71%] Building CXX object grid/unit_test/CMakeFiles/Grid_Parallel_MPI_test_SERIAL.dir/SERIAL/tstParallel_SERIAL.cpp.o
[ 71%] Built target Grid_LocalGrid_MPI_test_CUDA
[ 71%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseHalo_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 71%] Linking CXX executable Grid_GlobalParticleComm_MPI_test_CUDA
Scanning dependencies of target Grid_HypreStructuredSolver2d_MPI_test_CUDA
[ 71%] Building CXX object grid/unit_test/CMakeFiles/Grid_HypreStructuredSolver2d_MPI_test_CUDA.dir/CUDA/tstHypreStructuredSolver2d_CUDA.cpp.o
/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseArray.hpp(367): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseArray.hpp(367): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseArray.hpp(367): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseArray.hpp(367): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseArray.hpp(554): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseArray.hpp(554): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseArray.hpp(554): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseArray.hpp(554): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<int [3], float, double [3], int [2] > ,  ::Kokkos::CudaSpace,  ::Cabana::Grid::Node,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Cuda, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<int [3], float, double [3], int [2] > ,  ::Kokkos::CudaSpace,  ::Cabana::Grid::Node,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Cuda, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray [subobject]") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<int [3], float, double [3], int [2] > ,  ::Kokkos::CudaSpace,  ::Cabana::Grid::Cell,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Cuda, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<int [3], float, double [3], int [2] > ,  ::Kokkos::CudaSpace,  ::Cabana::Grid::Cell,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Cuda, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray [subobject]") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<int [3], float [3], int [2] > ,  ::Kokkos::CudaSpace,  ::Cabana::Grid::Node,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Cuda, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<int [3], float [3], int [2] > ,  ::Kokkos::CudaSpace,  ::Cabana::Grid::Node,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Cuda, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray [subobject]") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<int [3], float [3], int [2] > ,  ::Kokkos::CudaSpace,  ::Cabana::Grid::Cell,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Cuda, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<int [3], float [3], int [2] > ,  ::Kokkos::CudaSpace,  ::Cabana::Grid::Cell,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Cuda, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray [subobject]") is not allowed

[ 71%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseArray_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 71%] Linking CXX executable Grid_SparseArray_MPI_test_CUDA
[ 71%] Built target Grid_GlobalParticleComm_MPI_test_CUDA
Scanning dependencies of target Grid_HypreSemiStructuredSolver_MPI_test_CUDA
[ 72%] Building CXX object grid/unit_test/CMakeFiles/Grid_HypreSemiStructuredSolver_MPI_test_CUDA.dir/CUDA/tstHypreSemiStructuredSolver_CUDA.cpp.o
[ 72%] Built target Grid_SparseArray_MPI_test_CUDA
[ 72%] Building CXX object grid/unit_test/CMakeFiles/Grid_Parallel_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 72%] Building CXX object grid/unit_test/CMakeFiles/Grid_HypreSemiStructuredSolver_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 72%] Building CXX object grid/unit_test/CMakeFiles/Grid_HypreStructuredSolver3d_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 73%] Building CXX object grid/unit_test/CMakeFiles/Grid_HypreStructuredSolver2d_MPI_test_CUDA.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
Scanning dependencies of target Grid_GlobalGrid_MPI_test_CUDA_UVM
[ 73%] Building CXX object grid/unit_test/CMakeFiles/Grid_GlobalGrid_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstGlobalGrid_CUDA_UVM.cpp.o
/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseHalo.hpp(601): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseHalo.hpp(601): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<double [3], float > ,  ::Kokkos::CudaSpace,  ::Cabana::Grid::Node,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Cuda, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<double [3], float > ,  ::Kokkos::CudaSpace,  ::Cabana::Grid::Node,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Cuda, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray [subobject]") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseHalo.hpp(922): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseHalo.hpp(922): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseHalo.hpp(1184): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseHalo.hpp(1184): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseHalo.hpp(1184): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseHalo.hpp(1184): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("") is not allowed

[ 73%] Building CXX object grid/unit_test/CMakeFiles/Grid_Array3d_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 73%] Linking CXX executable Grid_SparseHalo_MPI_test_CUDA
[ 73%] Linking CXX executable Grid_Array3d_MPI_test_CUDA_UVM
[ 73%] Built target Grid_Array3d_MPI_test_CUDA_UVM
[ 73%] Built target Grid_SparseHalo_MPI_test_CUDA
Scanning dependencies of target Grid_Halo3d_MPI_test_CUDA_UVM
Scanning dependencies of target Grid_LocalGrid_MPI_test_CUDA_UVM
[ 73%] Building CXX object grid/unit_test/CMakeFiles/Grid_Halo3d_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstHalo3d_CUDA_UVM.cpp.o
[ 73%] Building CXX object grid/unit_test/CMakeFiles/Grid_LocalGrid_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstLocalGrid_CUDA_UVM.cpp.o
[ 73%] Linking CXX executable Grid_Parallel_MPI_test_SERIAL
[ 73%] Building CXX object grid/unit_test/CMakeFiles/Grid_IndexSpace_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 73%] Linking CXX executable Grid_IndexSpace_MPI_test_CUDA_UVM
[ 73%] Built target Grid_Parallel_MPI_test_SERIAL
Scanning dependencies of target Grid_HypreSemiStructuredSolver_MPI_test_CUDA_UVM
[ 73%] Building CXX object grid/unit_test/CMakeFiles/Grid_HypreSemiStructuredSolver_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstHypreSemiStructuredSolver_CUDA_UVM.cpp.o
[ 73%] Built target Grid_IndexSpace_MPI_test_CUDA_UVM
Scanning dependencies of target Grid_ParticleInit_MPI_test_SERIAL
[ 73%] Building CXX object grid/unit_test/CMakeFiles/Grid_ParticleInit_MPI_test_SERIAL.dir/SERIAL/tstParticleInit_SERIAL.cpp.o
[ 73%] Linking CXX executable Grid_HypreStructuredSolver3d_MPI_test_CUDA
[ 73%] Linking CXX executable Grid_HypreStructuredSolver2d_MPI_test_CUDA
[ 73%] Building CXX object grid/unit_test/CMakeFiles/Grid_GlobalGrid_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 73%] Linking CXX executable Grid_GlobalGrid_MPI_test_CUDA_UVM
[ 73%] Built target Grid_HypreStructuredSolver3d_MPI_test_CUDA
Scanning dependencies of target Grid_LocalMesh3d_MPI_test_CUDA_UVM
[ 73%] Building CXX object grid/unit_test/CMakeFiles/Grid_LocalMesh3d_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstLocalMesh3d_CUDA_UVM.cpp.o
[ 73%] Built target Grid_HypreStructuredSolver2d_MPI_test_CUDA
[ 73%] Building CXX object grid/unit_test/CMakeFiles/Grid_HypreSemiStructuredSolver_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 73%] Linking CXX executable Grid_HypreSemiStructuredSolver_MPI_test_CUDA
Scanning dependencies of target Grid_Halo2d_MPI_test_CUDA_UVM
[ 73%] Built target Grid_GlobalGrid_MPI_test_CUDA_UVM
[ 73%] Building CXX object grid/unit_test/CMakeFiles/Grid_Halo2d_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstHalo2d_CUDA_UVM.cpp.o
Scanning dependencies of target Grid_Splines_MPI_test_CUDA
[ 73%] Building CXX object grid/unit_test/CMakeFiles/Grid_Splines_MPI_test_CUDA.dir/CUDA/tstSplines_CUDA.cpp.o
[ 73%] Built target Grid_HypreSemiStructuredSolver_MPI_test_CUDA
[ 73%] Building CXX object grid/unit_test/CMakeFiles/Grid_LocalGrid_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
Scanning dependencies of target Grid_Interpolation3d_MPI_test_CUDA_UVM
[ 73%] Linking CXX executable Grid_LocalGrid_MPI_test_CUDA_UVM
[ 73%] Building CXX object grid/unit_test/CMakeFiles/Grid_Interpolation3d_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstInterpolation3d_CUDA_UVM.cpp.o
[ 74%] Linking CXX executable Grid_HypreSemiStructuredSolver_MPI_test_CUDA_UVM
[ 74%] Built target Grid_LocalGrid_MPI_test_CUDA_UVM
[ 75%] Building CXX object grid/unit_test/CMakeFiles/Grid_Interpolation3d_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
Scanning dependencies of target Grid_IndexConversion_MPI_test_CUDA_UVM
[ 75%] Building CXX object grid/unit_test/CMakeFiles/Grid_IndexConversion_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstIndexConversion_CUDA_UVM.cpp.o
[ 75%] Built target Grid_HypreSemiStructuredSolver_MPI_test_CUDA_UVM
Scanning dependencies of target Grid_ParticleInit_MPI_test_CUDA_UVM
[ 75%] Building CXX object grid/unit_test/CMakeFiles/Grid_ParticleInit_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstParticleInit_CUDA_UVM.cpp.o
[ 75%] Building CXX object grid/unit_test/CMakeFiles/Grid_Halo3d_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 75%] Linking CXX executable Grid_Halo3d_MPI_test_CUDA_UVM
[ 76%] Building CXX object grid/unit_test/CMakeFiles/Grid_Splines_MPI_test_CUDA.dir/__/__/cmake/test_harness/unit_test_main.cpp.o
[ 76%] Building CXX object grid/unit_test/CMakeFiles/Grid_ParticleInit_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 76%] Linking CXX executable Grid_Splines_MPI_test_CUDA
[ 76%] Linking CXX executable Grid_ParticleInit_MPI_test_SERIAL
[ 76%] Built target Grid_Halo3d_MPI_test_CUDA_UVM
Scanning dependencies of target Grid_ParticleGridDistributor2d_MPI_test_CUDA_UVM
[ 76%] Building CXX object grid/unit_test/CMakeFiles/Grid_ParticleGridDistributor2d_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstParticleGridDistributor2d_CUDA_UVM.cpp.o
[ 76%] Built target Grid_Splines_MPI_test_CUDA
Scanning dependencies of target Grid_SplineEvaluation3d_MPI_test_CUDA_UVM
[ 76%] Building CXX object grid/unit_test/CMakeFiles/Grid_SplineEvaluation3d_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstSplineEvaluation3d_CUDA_UVM.cpp.o
[ 76%] Built target Grid_ParticleInit_MPI_test_SERIAL
Scanning dependencies of target Grid_BovWriter_MPI_test_CUDA_UVM
[ 76%] Building CXX object grid/unit_test/CMakeFiles/Grid_BovWriter_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstBovWriter_CUDA_UVM.cpp.o
[ 77%] Building CXX object grid/unit_test/CMakeFiles/Grid_LocalMesh3d_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 77%] Linking CXX executable Grid_LocalMesh3d_MPI_test_CUDA_UVM
[ 77%] Building CXX object grid/unit_test/CMakeFiles/Grid_Halo2d_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 78%] Linking CXX executable Grid_Halo2d_MPI_test_CUDA_UVM
[ 78%] Built target Grid_LocalMesh3d_MPI_test_CUDA_UVM
[ 78%] Built target Grid_Halo2d_MPI_test_CUDA_UVM
Scanning dependencies of target Grid_Partitioner_MPI_test_SERIAL
[ 78%] Building CXX object grid/unit_test/CMakeFiles/Grid_Partitioner_MPI_test_SERIAL.dir/SERIAL/tstPartitioner_SERIAL.cpp.o
Scanning dependencies of target Grid_SplineEvaluation2d_MPI_test_CUDA_UVM
[ 79%] Building CXX object grid/unit_test/CMakeFiles/Grid_SplineEvaluation2d_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstSplineEvaluation2d_CUDA_UVM.cpp.o
[ 79%] Linking CXX executable Grid_Interpolation3d_MPI_test_CUDA_UVM
[ 79%] Building CXX object grid/unit_test/CMakeFiles/Grid_IndexConversion_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 80%] Linking CXX executable Grid_IndexConversion_MPI_test_CUDA_UVM
[ 80%] Building CXX object grid/unit_test/CMakeFiles/Grid_ParticleInit_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 81%] Linking CXX executable Grid_ParticleInit_MPI_test_CUDA_UVM
[ 81%] Built target Grid_Interpolation3d_MPI_test_CUDA_UVM
Scanning dependencies of target Grid_Interpolation2d_MPI_test_CUDA_UVM
[ 82%] Building CXX object grid/unit_test/CMakeFiles/Grid_Interpolation2d_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstInterpolation2d_CUDA_UVM.cpp.o
[ 83%] Building CXX object grid/unit_test/CMakeFiles/Grid_SplineEvaluation3d_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 83%] Linking CXX executable Grid_SplineEvaluation3d_MPI_test_CUDA_UVM
[ 83%] Built target Grid_IndexConversion_MPI_test_CUDA_UVM
Scanning dependencies of target Grid_SparseArray_MPI_test_CUDA_UVM
[ 83%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseArray_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstSparseArray_CUDA_UVM.cpp.o
[ 83%] Built target Grid_ParticleInit_MPI_test_CUDA_UVM
[ 83%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseArray_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
Scanning dependencies of target Grid_SparseDimPartitioner_MPI_test_CUDA_UVM
[ 83%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseDimPartitioner_MPI_test_CUDA_UVM.dir/CUDA_UVM/tstSparseDimPartitioner_CUDA_UVM.cpp.o
[ 83%] Building CXX object grid/unit_test/CMakeFiles/Grid_BovWriter_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 83%] Linking CXX executable Grid_BovWriter_MPI_test_CUDA_UVM
[ 83%] Built target Grid_SplineEvaluation3d_MPI_test_CUDA_UVM
Scanning dependencies of target HelloWorld
[ 84%] Building CXX object example/core_tutorial/01_hello_world/CMakeFiles/HelloWorld.dir/hello_world.cpp.o
[ 84%] Building CXX object grid/unit_test/CMakeFiles/Grid_ParticleGridDistributor2d_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 84%] Built target Grid_BovWriter_MPI_test_CUDA_UVM
[ 84%] Building CXX object grid/unit_test/CMakeFiles/Grid_SplineEvaluation2d_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 84%] Linking CXX executable Grid_ParticleGridDistributor2d_MPI_test_CUDA_UVM
[ 84%] Building CXX object grid/unit_test/CMakeFiles/Grid_SparseDimPartitioner_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 84%] Building CXX object grid/unit_test/CMakeFiles/Grid_Interpolation2d_MPI_test_CUDA_UVM.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 84%] Building CXX object grid/unit_test/CMakeFiles/Grid_Partitioner_MPI_test_SERIAL.dir/__/__/cmake/test_harness/mpi_unit_test_main.cpp.o
[ 84%] Linking CXX executable Grid_Partitioner_MPI_test_SERIAL
Scanning dependencies of target Tuple
[ 84%] Building CXX object example/core_tutorial/02_tuple/CMakeFiles/Tuple.dir/tuple_example.cpp.o
[ 84%] Built target Grid_ParticleGridDistributor2d_MPI_test_CUDA_UVM
Scanning dependencies of target StructOfArrays
[ 84%] Building CXX object example/core_tutorial/03_struct_of_arrays/CMakeFiles/StructOfArrays.dir/soa_example.cpp.o
[ 84%] Built target Grid_Partitioner_MPI_test_SERIAL
Scanning dependencies of target AdvancedUnmanagedAoSoA
[ 84%] Building CXX object example/core_tutorial/04_aosoa_advanced_unmanaged/CMakeFiles/AdvancedUnmanagedAoSoA.dir/advanced_aosoa_unmanaged.cpp.o
[ 84%] Linking CXX executable Grid_SplineEvaluation2d_MPI_test_CUDA_UVM
[ 84%] Built target Grid_SplineEvaluation2d_MPI_test_CUDA_UVM
Scanning dependencies of target ArrayOfStructsOfArrays
[ 84%] Building CXX object example/core_tutorial/04_aosoa/CMakeFiles/ArrayOfStructsOfArrays.dir/aosoa_example.cpp.o
[ 84%] Linking CXX executable HelloWorld
[ 84%] Built target HelloWorld
[ 84%] Linking CXX executable Tuple
Scanning dependencies of target Slice
[ 84%] Building CXX object example/core_tutorial/05_slice/CMakeFiles/Slice.dir/slice_example.cpp.o
[ 84%] Linking CXX executable StructOfArrays
[ 84%] Built target Tuple
Scanning dependencies of target DeepCopy
[ 84%] Building CXX object example/core_tutorial/06_deep_copy/CMakeFiles/DeepCopy.dir/deep_copy_example.cpp.o
[ 84%] Linking CXX executable Grid_Interpolation2d_MPI_test_CUDA_UVM
[ 84%] Built target StructOfArrays
Scanning dependencies of target Sorting
[ 85%] Building CXX object example/core_tutorial/07_sorting/CMakeFiles/Sorting.dir/sorting_example.cpp.o
[ 85%] Linking CXX executable Grid_SparseDimPartitioner_MPI_test_CUDA_UVM
[ 85%] Built target Grid_Interpolation2d_MPI_test_CUDA_UVM
[ 85%] Linking CXX executable AdvancedUnmanagedAoSoA
Scanning dependencies of target LinkedCellList
[ 85%] Building CXX object example/core_tutorial/08_linked_cell_list/CMakeFiles/LinkedCellList.dir/linked_cell_list_example.cpp.o
[ 85%] Built target Grid_SparseDimPartitioner_MPI_test_CUDA_UVM
Scanning dependencies of target VerletList
[ 85%] Building CXX object example/core_tutorial/09_neighbor_list/CMakeFiles/VerletList.dir/verlet_list_example.cpp.o
[ 85%] Built target AdvancedUnmanagedAoSoA
Scanning dependencies of target ArborXList
[ 85%] Building CXX object example/core_tutorial/09_neighbor_list_arborx/CMakeFiles/ArborXList.dir/arborx_neighborlist_example.cpp.o
[ 85%] Linking CXX executable ArrayOfStructsOfArrays
[ 85%] Built target ArrayOfStructsOfArrays
Scanning dependencies of target NeighParallelFor
[ 85%] Building CXX object example/core_tutorial/10_neighbor_parallel_for/CMakeFiles/NeighParallelFor.dir/neighbor_parallel_for_example.cpp.o
/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseArray.hpp(367): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseArray.hpp(367): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseArray.hpp(367): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseArray.hpp(367): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseArray.hpp(554): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseArray.hpp(554): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseArray.hpp(554): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/unit_test/tstSparseArray.hpp(554): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<int [3], float, double [3], int [2] > ,  ::Kokkos::CudaUVMSpace,  ::Cabana::Grid::Node,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Cuda, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<int [3], float, double [3], int [2] > ,  ::Kokkos::CudaUVMSpace,  ::Cabana::Grid::Node,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Cuda, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray [subobject]") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<int [3], float, double [3], int [2] > ,  ::Kokkos::CudaUVMSpace,  ::Cabana::Grid::Cell,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Cuda, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<int [3], float, double [3], int [2] > ,  ::Kokkos::CudaUVMSpace,  ::Cabana::Grid::Cell,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Cuda, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray [subobject]") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<int [3], float [3], int [2] > ,  ::Kokkos::CudaUVMSpace,  ::Cabana::Grid::Node,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Cuda, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<int [3], float [3], int [2] > ,  ::Kokkos::CudaUVMSpace,  ::Cabana::Grid::Node,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Cuda, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray [subobject]") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<int [3], float [3], int [2] > ,  ::Kokkos::CudaUVMSpace,  ::Cabana::Grid::Cell,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Cuda, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray") is not allowed

/var/jenkins/workspace/Cabana_PR-743/grid/src/Cabana_Grid_SparseIndexSpace.hpp(1009): warning: calling a __host__ function("std::__shared_count<( ::__gnu_cxx::_Lock_policy)2> ::~__shared_count") from a __host__ __device__ function("Cabana::Grid::Experimental::SparseArray< ::Cabana::MemberTypes<int [3], float [3], int [2] > ,  ::Kokkos::CudaUVMSpace,  ::Cabana::Grid::Cell,  ::Cabana::Grid::SparseMesh<float, (unsigned long)3ul> ,  ::Cabana::Grid::SparseMap< ::Kokkos::Cuda, (unsigned long long)4ull, ( ::Cabana::Grid::HashTypes)0, unsigned long, unsigned long> > ::~SparseArray [subobject]") is not allowed
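All of these nvcc diagnostics share one root cause: std::shared_ptr's reference-counting machinery (std::__shared_count in libstdc++) is host-only, but it is reachable from __host__ __device__ members of the sparse-array types exercised by these tests, so nvcc flags it while compiling the device pass. As a rough illustration only (a hypothetical minimal sketch, not the Cabana sources), the following produces the same class of warning:

    // warn_shared_count.cu -- hypothetical reproducer, not the Cabana code.
    // Suggested build (assumption): nvcc -std=c++17 -c warn_shared_count.cu
    #include <memory>
    #include <utility>

    // A type that owns a std::shared_ptr but exposes __host__ __device__
    // special members. Copying or destroying `data` calls the host-only
    // libstdc++ helpers, so nvcc warns during the device compilation pass.
    struct Holder
    {
        std::shared_ptr<int> data;

        __host__ Holder( std::shared_ptr<int> p ) : data( std::move( p ) ) {}

        __host__ __device__ Holder( const Holder& other )
            : data( other.data ) // -> host-only __shared_count constructor
        {
        }

        __host__ __device__ ~Holder() {} // -> host-only ~__shared_count
    };

    // Defined only so the device versions of Holder's copy constructor and
    // destructor get compiled; never launched, since actually running the
    // host-only shared_ptr machinery on the device would be invalid.
    __global__ void force_device_compile( const Holder* p )
    {
        Holder copy( *p ); // device-side copy construction and destruction
        (void) copy;
    }

    int main()
    {
        // Host-side use is well defined; the messages are warnings, not errors.
        Holder h( std::make_shared<int>( 42 ) );
        return 0;
    }

They remain warnings rather than errors because the host-only calls are only ill-formed if that code path actually executes on the device; the corresponding SparseArray tests build and pass later in this log.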

[ 86%] Linking CXX executable Grid_SparseArray_MPI_test_CUDA_UVM
[ 86%] Built target Grid_SparseArray_MPI_test_CUDA_UVM
Scanning dependencies of target SimdParallelFor
[ 86%] Linking CXX executable Slice
[ 86%] Building CXX object example/core_tutorial/10_simd_parallel_for/CMakeFiles/SimdParallelFor.dir/simd_parallel_for_example.cpp.o
[ 86%] Built target Slice
Scanning dependencies of target Migration
[ 86%] Building CXX object example/core_tutorial/11_migration/CMakeFiles/Migration.dir/migration_example.cpp.o
[ 86%] Linking CXX executable Sorting
[ 86%] Built target Sorting
Scanning dependencies of target HaloExchange
[ 87%] Building CXX object example/core_tutorial/12_halo_exchange/CMakeFiles/HaloExchange.dir/halo_exchange_example.cpp.o
[ 88%] Linking CXX executable VerletList
[ 88%] Linking CXX executable LinkedCellList
[ 88%] Built target VerletList
Scanning dependencies of target AdvancedCudaSlice
[ 88%] Built target LinkedCellList
[ 88%] Building CXX object example/core_tutorial/05_slice_advanced_cuda/CMakeFiles/AdvancedCudaSlice.dir/advanced_slice_cuda.cpp.o
Scanning dependencies of target MeshTypes
[ 89%] Building CXX object example/grid_tutorial/01_types/CMakeFiles/MeshTypes.dir/types_example.cpp.o
[ 90%] Linking CXX executable NeighParallelFor
[ 90%] Linking CXX executable DeepCopy
[ 90%] Built target NeighParallelFor
[ 90%] Built target DeepCopy
Scanning dependencies of target GlobalMesh
Scanning dependencies of target Partitioner
[ 90%] Building CXX object example/grid_tutorial/02_global_mesh/CMakeFiles/GlobalMesh.dir/global_mesh_example.cpp.o
[ 91%] Building CXX object example/grid_tutorial/03_partitioner/CMakeFiles/Partitioner.dir/partitioner_example.cpp.o
[ 92%] Linking CXX executable ArborXList
[ 92%] Built target ArborXList
Scanning dependencies of target GlobalGrid
[ 93%] Building CXX object example/grid_tutorial/04_global_grid/CMakeFiles/GlobalGrid.dir/global_grid_example.cpp.o
[ 93%] Linking CXX executable SimdParallelFor
[ 93%] Built target SimdParallelFor
Scanning dependencies of target IndexSpace
[ 93%] Building CXX object example/grid_tutorial/05_index_space/CMakeFiles/IndexSpace.dir/index_space_example.cpp.o
[ 93%] Linking CXX executable Migration
[ 93%] Linking CXX executable MeshTypes
[ 93%] Built target Migration
Scanning dependencies of target LocalGrid
[ 93%] Building CXX object example/grid_tutorial/06_local_grid/CMakeFiles/LocalGrid.dir/local_grid_example.cpp.o
[ 93%] Linking CXX executable AdvancedCudaSlice
[ 93%] Built target MeshTypes
Scanning dependencies of target LocalMesh
[ 93%] Building CXX object example/grid_tutorial/07_local_mesh/CMakeFiles/LocalMesh.dir/local_mesh_example.cpp.o
[ 93%] Linking CXX executable Partitioner
[ 93%] Linking CXX executable HaloExchange
[ 93%] Built target AdvancedCudaSlice
Scanning dependencies of target Array
[ 93%] Built target HaloExchange
[ 93%] Building CXX object example/grid_tutorial/08_array/CMakeFiles/Array.dir/array_example.cpp.o
Scanning dependencies of target GridParallel
[ 93%] Building CXX object example/grid_tutorial/09_grid_parallel/CMakeFiles/GridParallel.dir/grid_parallel_example.cpp.o
[ 93%] Built target Partitioner
[ 93%] Linking CXX executable GlobalMesh
Scanning dependencies of target HeffteFFT
[ 93%] Building CXX object example/grid_tutorial/10_fft_heffte/CMakeFiles/HeffteFFT.dir/heffte_fast_fourier_transform_example.cpp.o
[ 93%] Built target GlobalMesh
Scanning dependencies of target StructuredSolver
[ 94%] Building CXX object example/grid_tutorial/11_structured_solver/CMakeFiles/StructuredSolver.dir/structured_solver_example.cpp.o
[ 94%] Linking CXX executable GlobalGrid
[ 94%] Built target GlobalGrid
Scanning dependencies of target Halo
[ 94%] Building CXX object example/grid_tutorial/12_halo/CMakeFiles/Halo.dir/halo_example.cpp.o
[ 94%] Linking CXX executable IndexSpace
[ 94%] Linking CXX executable LocalGrid
[ 94%] Built target IndexSpace
Scanning dependencies of target Spline
[ 94%] Building CXX object example/grid_tutorial/14_spline/CMakeFiles/Spline.dir/spline_example.cpp.o
[ 94%] Built target LocalGrid
[ 94%] Linking CXX executable LocalMesh
Scanning dependencies of target Interpolation
[ 94%] Building CXX object example/grid_tutorial/15_interpolation/CMakeFiles/Interpolation.dir/interpolation_example.cpp.o
[ 95%] Linking CXX executable GridParallel
[ 95%] Built target LocalMesh
[ 95%] Linking CXX executable Array
Scanning dependencies of target LinkedCellPerformance
[ 96%] Building CXX object benchmark/core/CMakeFiles/LinkedCellPerformance.dir/Cabana_LinkedCellPerformance.cpp.o
[ 96%] Built target Array
[ 96%] Built target GridParallel
Scanning dependencies of target NeighborArborXPerformance
Scanning dependencies of target BinSortPerformance
[ 96%] Building CXX object benchmark/core/CMakeFiles/NeighborArborXPerformance.dir/Cabana_NeighborArborXPerformance.cpp.o
[ 97%] Building CXX object benchmark/core/CMakeFiles/BinSortPerformance.dir/Cabana_BinSortPerformance.cpp.o
[ 97%] Linking CXX executable HeffteFFT
[ 97%] Built target HeffteFFT
Scanning dependencies of target NeighborVerletPerformance
[ 97%] Building CXX object benchmark/core/CMakeFiles/NeighborVerletPerformance.dir/Cabana_NeighborVerletPerformance.cpp.o
[ 97%] Linking CXX executable StructuredSolver
[ 97%] Linking CXX executable Halo
[ 97%] Linking CXX executable Spline
[ 97%] Built target StructuredSolver
Scanning dependencies of target CommPerformance
[ 98%] Building CXX object benchmark/core/CMakeFiles/CommPerformance.dir/Cabana_CommPerformance.cpp.o
[ 98%] Built target Halo
[ 98%] Built target Spline
Scanning dependencies of target SparsePartitionerPerformance
Scanning dependencies of target HaloPerformance
[ 98%] Building CXX object benchmark/grid/CMakeFiles/HaloPerformance.dir/Cabana_Grid_HaloPerformance.cpp.o
[ 98%] Building CXX object benchmark/grid/CMakeFiles/SparsePartitionerPerformance.dir/Cabana_Grid_SparsePartitionerPerformance.cpp.o
[ 99%] Linking CXX executable Interpolation
[ 99%] Built target Interpolation
Scanning dependencies of target SparseMapPerformance
[ 99%] Building CXX object benchmark/grid/CMakeFiles/SparseMapPerformance.dir/Cabana_Grid_SparseMapPerformance.cpp.o
[ 99%] Linking CXX executable BinSortPerformance
[ 99%] Linking CXX executable LinkedCellPerformance
[ 99%] Built target BinSortPerformance
Scanning dependencies of target FastFourierTransformPerformance
[ 99%] Building CXX object benchmark/grid/CMakeFiles/FastFourierTransformPerformance.dir/Cabana_Grid_FastFourierTransformPerformance.cpp.o
[ 99%] Built target LinkedCellPerformance
Scanning dependencies of target InterpolationPerformance
[ 99%] Building CXX object benchmark/grid/CMakeFiles/InterpolationPerformance.dir/Cabana_Grid_InterpolationPerformance.cpp.o
[ 99%] Linking CXX executable HaloPerformance
[ 99%] Linking CXX executable NeighborVerletPerformance
[ 99%] Built target HaloPerformance
[ 99%] Built target NeighborVerletPerformance
[ 99%] Linking CXX executable NeighborArborXPerformance
[ 99%] Built target NeighborArborXPerformance
[100%] Linking CXX executable SparsePartitionerPerformance
[100%] Linking CXX executable CommPerformance
[100%] Built target SparsePartitionerPerformance
[100%] Built target CommPerformance
[100%] Linking CXX executable FastFourierTransformPerformance
[100%] Linking CXX executable SparseMapPerformance
[100%] Built target FastFourierTransformPerformance
[100%] Built target SparseMapPerformance
[100%] Linking CXX executable InterpolationPerformance
[100%] Built target InterpolationPerformance
+ ctest --output-on-failure
Test project /var/jenkins/workspace/Cabana_PR-743/build
        Start   1: Cabana_Version_test
  1/191 Test   #1: Cabana_Version_test ..........................................   Passed    0.68 sec
        Start   2: Cabana_Index_test
  2/191 Test   #2: Cabana_Index_test ............................................   Passed    0.66 sec
        Start   3: Cabana_CartesianGrid_test
  3/191 Test   #3: Cabana_CartesianGrid_test ....................................   Passed    0.65 sec
        Start   4: Cabana_SoA_test
  4/191 Test   #4: Cabana_SoA_test ..............................................   Passed    0.66 sec
        Start   5: Cabana_AoSoA_test_SERIAL
  5/191 Test   #5: Cabana_AoSoA_test_SERIAL .....................................   Passed    0.66 sec
        Start   6: Cabana_DeepCopy_test_SERIAL
  6/191 Test   #6: Cabana_DeepCopy_test_SERIAL ..................................   Passed    0.70 sec
        Start   7: Cabana_LinkedCellList_test_SERIAL
  7/191 Test   #7: Cabana_LinkedCellList_test_SERIAL ............................   Passed    0.66 sec
        Start   8: Cabana_NeighborList_test_SERIAL
  8/191 Test   #8: Cabana_NeighborList_test_SERIAL ..............................   Passed    2.39 sec
        Start   9: Cabana_Parallel_test_SERIAL
  9/191 Test   #9: Cabana_Parallel_test_SERIAL ..................................   Passed    0.66 sec
        Start  10: Cabana_ParameterPack_test_SERIAL
 10/191 Test  #10: Cabana_ParameterPack_test_SERIAL .............................   Passed    0.65 sec
        Start  11: Cabana_ParticleInit_test_SERIAL
 11/191 Test  #11: Cabana_ParticleInit_test_SERIAL ..............................   Passed    0.67 sec
        Start  12: Cabana_ParticleList_test_SERIAL
 12/191 Test  #12: Cabana_ParticleList_test_SERIAL ..............................   Passed    0.65 sec
        Start  13: Cabana_Slice_test_SERIAL
 13/191 Test  #13: Cabana_Slice_test_SERIAL .....................................   Passed    0.65 sec
        Start  14: Cabana_Sort_test_SERIAL
 14/191 Test  #14: Cabana_Sort_test_SERIAL ......................................   Passed    0.69 sec
        Start  15: Cabana_Tuple_test_SERIAL
 15/191 Test  #15: Cabana_Tuple_test_SERIAL .....................................   Passed    0.66 sec
        Start  16: Cabana_NeighborListArborX_test_SERIAL
 16/191 Test  #16: Cabana_NeighborListArborX_test_SERIAL ........................   Passed    1.00 sec
        Start  17: Cabana_AoSoA_test_CUDA
 17/191 Test  #17: Cabana_AoSoA_test_CUDA .......................................   Passed    0.66 sec
        Start  18: Cabana_DeepCopy_test_CUDA
 18/191 Test  #18: Cabana_DeepCopy_test_CUDA ....................................   Passed    0.68 sec
        Start  19: Cabana_LinkedCellList_test_CUDA
 19/191 Test  #19: Cabana_LinkedCellList_test_CUDA ..............................   Passed    0.68 sec
        Start  20: Cabana_NeighborList_test_CUDA
 20/191 Test  #20: Cabana_NeighborList_test_CUDA ................................   Passed    0.93 sec
        Start  21: Cabana_Parallel_test_CUDA
 21/191 Test  #21: Cabana_Parallel_test_CUDA ....................................   Passed    0.68 sec
        Start  22: Cabana_ParameterPack_test_CUDA
 22/191 Test  #22: Cabana_ParameterPack_test_CUDA ...............................   Passed    0.69 sec
        Start  23: Cabana_ParticleInit_test_CUDA
 23/191 Test  #23: Cabana_ParticleInit_test_CUDA ................................   Passed    0.86 sec
        Start  24: Cabana_ParticleList_test_CUDA
 24/191 Test  #24: Cabana_ParticleList_test_CUDA ................................   Passed    0.68 sec
        Start  25: Cabana_Slice_test_CUDA
 25/191 Test  #25: Cabana_Slice_test_CUDA .......................................   Passed    0.68 sec
        Start  26: Cabana_Sort_test_CUDA
 26/191 Test  #26: Cabana_Sort_test_CUDA ........................................   Passed    0.73 sec
        Start  27: Cabana_Tuple_test_CUDA
 27/191 Test  #27: Cabana_Tuple_test_CUDA .......................................   Passed    0.68 sec
        Start  28: Cabana_NeighborListArborX_test_CUDA
 28/191 Test  #28: Cabana_NeighborListArborX_test_CUDA ..........................   Passed    0.78 sec
        Start  29: Cabana_AoSoA_test_CUDA_UVM
 29/191 Test  #29: Cabana_AoSoA_test_CUDA_UVM ...................................   Passed    0.68 sec
        Start  30: Cabana_DeepCopy_test_CUDA_UVM
 30/191 Test  #30: Cabana_DeepCopy_test_CUDA_UVM ................................   Passed    0.69 sec
        Start  31: Cabana_LinkedCellList_test_CUDA_UVM
 31/191 Test  #31: Cabana_LinkedCellList_test_CUDA_UVM ..........................   Passed    0.67 sec
        Start  32: Cabana_NeighborList_test_CUDA_UVM
 32/191 Test  #32: Cabana_NeighborList_test_CUDA_UVM ............................   Passed    1.00 sec
        Start  33: Cabana_Parallel_test_CUDA_UVM
 33/191 Test  #33: Cabana_Parallel_test_CUDA_UVM ................................   Passed    0.66 sec
        Start  34: Cabana_ParameterPack_test_CUDA_UVM
 34/191 Test  #34: Cabana_ParameterPack_test_CUDA_UVM ...........................   Passed    0.66 sec
        Start  35: Cabana_ParticleInit_test_CUDA_UVM
 35/191 Test  #35: Cabana_ParticleInit_test_CUDA_UVM ............................   Passed    0.84 sec
        Start  36: Cabana_ParticleList_test_CUDA_UVM
 36/191 Test  #36: Cabana_ParticleList_test_CUDA_UVM ............................   Passed    0.66 sec
        Start  37: Cabana_Slice_test_CUDA_UVM
 37/191 Test  #37: Cabana_Slice_test_CUDA_UVM ...................................   Passed    0.67 sec
        Start  38: Cabana_Sort_test_CUDA_UVM
 38/191 Test  #38: Cabana_Sort_test_CUDA_UVM ....................................   Passed    0.70 sec
        Start  39: Cabana_Tuple_test_CUDA_UVM
 39/191 Test  #39: Cabana_Tuple_test_CUDA_UVM ...................................   Passed    0.67 sec
        Start  40: Cabana_NeighborListArborX_test_CUDA_UVM
 40/191 Test  #40: Cabana_NeighborListArborX_test_CUDA_UVM ......................   Passed    0.82 sec
        Start  41: Cabana_CommunicationPlan_MPI_test_SERIAL_np_1
 41/191 Test  #41: Cabana_CommunicationPlan_MPI_test_SERIAL_np_1 ................   Passed    1.01 sec
        Start  42: Cabana_Distributor_MPI_test_SERIAL_np_1
 42/191 Test  #42: Cabana_Distributor_MPI_test_SERIAL_np_1 ......................   Passed    0.94 sec
        Start  43: Cabana_Halo_MPI_test_SERIAL_np_1
 43/191 Test  #43: Cabana_Halo_MPI_test_SERIAL_np_1 .............................   Passed    0.93 sec
        Start  44: Cabana_CommunicationPlan_MPI_test_CUDA_np_1
 44/191 Test  #44: Cabana_CommunicationPlan_MPI_test_CUDA_np_1 ..................   Passed    0.97 sec
        Start  45: Cabana_Distributor_MPI_test_CUDA_np_1
 45/191 Test  #45: Cabana_Distributor_MPI_test_CUDA_np_1 ........................   Passed    0.99 sec
        Start  46: Cabana_Halo_MPI_test_CUDA_np_1
 46/191 Test  #46: Cabana_Halo_MPI_test_CUDA_np_1 ...............................   Passed    1.05 sec
        Start  47: Cabana_CommunicationPlan_MPI_test_CUDA_UVM_np_1
 47/191 Test  #47: Cabana_CommunicationPlan_MPI_test_CUDA_UVM_np_1 ..............   Passed    0.94 sec
        Start  48: Cabana_Distributor_MPI_test_CUDA_UVM_np_1
 48/191 Test  #48: Cabana_Distributor_MPI_test_CUDA_UVM_np_1 ....................   Passed    0.94 sec
        Start  49: Cabana_Halo_MPI_test_CUDA_UVM_np_1
 49/191 Test  #49: Cabana_Halo_MPI_test_CUDA_UVM_np_1 ...........................   Passed    0.96 sec
        Start  50: Grid_GlobalMesh_MPI_test_SERIAL
 50/191 Test  #50: Grid_GlobalMesh_MPI_test_SERIAL ..............................   Passed    0.66 sec
        Start  51: Grid_IndexSpace_MPI_test_SERIAL
 51/191 Test  #51: Grid_IndexSpace_MPI_test_SERIAL ..............................   Passed    0.68 sec
        Start  52: Grid_SparseIndexSpace_MPI_test_SERIAL
 52/191 Test  #52: Grid_SparseIndexSpace_MPI_test_SERIAL ........................   Passed    0.77 sec
        Start  53: Grid_Splines_MPI_test_SERIAL
 53/191 Test  #53: Grid_Splines_MPI_test_SERIAL .................................   Passed    0.66 sec
        Start  54: Grid_GlobalMesh_MPI_test_CUDA
 54/191 Test  #54: Grid_GlobalMesh_MPI_test_CUDA ................................   Passed    0.66 sec
        Start  55: Grid_IndexSpace_MPI_test_CUDA
 55/191 Test  #55: Grid_IndexSpace_MPI_test_CUDA ................................   Passed    0.67 sec
        Start  56: Grid_SparseIndexSpace_MPI_test_CUDA
 56/191 Test  #56: Grid_SparseIndexSpace_MPI_test_CUDA ..........................   Passed    0.73 sec
        Start  57: Grid_Splines_MPI_test_CUDA
 57/191 Test  #57: Grid_Splines_MPI_test_CUDA ...................................   Passed    0.66 sec
        Start  58: Grid_GlobalMesh_MPI_test_CUDA_UVM
 58/191 Test  #58: Grid_GlobalMesh_MPI_test_CUDA_UVM ............................   Passed    0.66 sec
        Start  59: Grid_IndexSpace_MPI_test_CUDA_UVM
 59/191 Test  #59: Grid_IndexSpace_MPI_test_CUDA_UVM ............................   Passed    0.68 sec
        Start  60: Grid_SparseIndexSpace_MPI_test_CUDA_UVM
 60/191 Test  #60: Grid_SparseIndexSpace_MPI_test_CUDA_UVM ......................   Passed    0.74 sec
        Start  61: Grid_Splines_MPI_test_CUDA_UVM
 61/191 Test  #61: Grid_Splines_MPI_test_CUDA_UVM ...............................   Passed    0.66 sec
        Start  62: Grid_GlobalGrid_MPI_test_SERIAL_np_1
 62/191 Test  #62: Grid_GlobalGrid_MPI_test_SERIAL_np_1 .........................   Passed    0.89 sec
        Start  63: Grid_GlobalParticleComm_MPI_test_SERIAL_np_1
 63/191 Test  #63: Grid_GlobalParticleComm_MPI_test_SERIAL_np_1 .................   Passed    0.87 sec
        Start  64: Grid_LocalGrid_MPI_test_SERIAL_np_1
 64/191 Test  #64: Grid_LocalGrid_MPI_test_SERIAL_np_1 ..........................   Passed    0.86 sec
        Start  65: Grid_IndexConversion_MPI_test_SERIAL_np_1
 65/191 Test  #65: Grid_IndexConversion_MPI_test_SERIAL_np_1 ....................   Passed    2.73 sec
        Start  66: Grid_LocalMesh3d_MPI_test_SERIAL_np_1
 66/191 Test  #66: Grid_LocalMesh3d_MPI_test_SERIAL_np_1 ........................   Passed    2.18 sec
        Start  67: Grid_LocalMesh2d_MPI_test_SERIAL_np_1
 67/191 Test  #67: Grid_LocalMesh2d_MPI_test_SERIAL_np_1 ........................   Passed    0.88 sec
        Start  68: Grid_Array3d_MPI_test_SERIAL_np_1
 68/191 Test  #68: Grid_Array3d_MPI_test_SERIAL_np_1 ............................   Passed    0.98 sec
        Start  69: Grid_Array2d_MPI_test_SERIAL_np_1
 69/191 Test  #69: Grid_Array2d_MPI_test_SERIAL_np_1 ............................   Passed    0.86 sec
        Start  70: Grid_Halo3d_MPI_test_SERIAL_np_1
 70/191 Test  #70: Grid_Halo3d_MPI_test_SERIAL_np_1 .............................   Passed    1.87 sec
        Start  71: Grid_Halo2d_MPI_test_SERIAL_np_1
 71/191 Test  #71: Grid_Halo2d_MPI_test_SERIAL_np_1 .............................   Passed    0.89 sec
        Start  72: Grid_ParticleInit_MPI_test_SERIAL_np_1
 72/191 Test  #72: Grid_ParticleInit_MPI_test_SERIAL_np_1 .......................   Passed    1.18 sec
        Start  73: Grid_ParticleGridDistributor2d_MPI_test_SERIAL_np_1
 73/191 Test  #73: Grid_ParticleGridDistributor2d_MPI_test_SERIAL_np_1 ..........   Passed    0.92 sec
        Start  74: Grid_ParticleGridDistributor3d_MPI_test_SERIAL_np_1
 74/191 Test  #74: Grid_ParticleGridDistributor3d_MPI_test_SERIAL_np_1 ..........   Passed    0.98 sec
        Start  75: Grid_SplineEvaluation3d_MPI_test_SERIAL_np_1
 75/191 Test  #75: Grid_SplineEvaluation3d_MPI_test_SERIAL_np_1 .................   Passed    1.63 sec
        Start  76: Grid_SplineEvaluation2d_MPI_test_SERIAL_np_1
 76/191 Test  #76: Grid_SplineEvaluation2d_MPI_test_SERIAL_np_1 .................   Passed    0.89 sec
        Start  77: Grid_Interpolation3d_MPI_test_SERIAL_np_1
 77/191 Test  #77: Grid_Interpolation3d_MPI_test_SERIAL_np_1 ....................   Passed    1.07 sec
        Start  78: Grid_Interpolation2d_MPI_test_SERIAL_np_1
 78/191 Test  #78: Grid_Interpolation2d_MPI_test_SERIAL_np_1 ....................   Passed    0.90 sec
        Start  79: Grid_BovWriter_MPI_test_SERIAL_np_1
 79/191 Test  #79: Grid_BovWriter_MPI_test_SERIAL_np_1 ..........................   Passed    0.94 sec
        Start  80: Grid_Parallel_MPI_test_SERIAL_np_1
 80/191 Test  #80: Grid_Parallel_MPI_test_SERIAL_np_1 ...........................   Passed    0.97 sec
        Start  81: Grid_Partitioner_MPI_test_SERIAL_np_1
 81/191 Test  #81: Grid_Partitioner_MPI_test_SERIAL_np_1 ........................   Passed    0.87 sec
        Start  82: Grid_ParticleList_MPI_test_SERIAL_np_1
 82/191 Test  #82: Grid_ParticleList_MPI_test_SERIAL_np_1 .......................   Passed    0.85 sec
        Start  83: Grid_SparseArray_MPI_test_SERIAL_np_1
 83/191 Test  #83: Grid_SparseArray_MPI_test_SERIAL_np_1 ........................   Passed    1.01 sec
        Start  84: Grid_SparseDimPartitioner_MPI_test_SERIAL_np_1
 84/191 Test  #84: Grid_SparseDimPartitioner_MPI_test_SERIAL_np_1 ...............   Passed    0.92 sec
        Start  85: Grid_SparseHalo_MPI_test_SERIAL_np_1
 85/191 Test  #85: Grid_SparseHalo_MPI_test_SERIAL_np_1 .........................   Passed    0.86 sec
        Start  86: Grid_SparseLocalGrid_MPI_test_SERIAL_np_1
 86/191 Test  #86: Grid_SparseLocalGrid_MPI_test_SERIAL_np_1 ....................   Passed    0.86 sec
        Start  87: Grid_HypreStructuredSolver3d_MPI_test_SERIAL_np_1
 87/191 Test  #87: Grid_HypreStructuredSolver3d_MPI_test_SERIAL_np_1 ............   Passed    0.85 sec
        Start  88: Grid_HypreStructuredSolver2d_MPI_test_SERIAL_np_1
 88/191 Test  #88: Grid_HypreStructuredSolver2d_MPI_test_SERIAL_np_1 ............   Passed    0.85 sec
        Start  89: Grid_HypreSemiStructuredSolver_MPI_test_SERIAL_np_1
 89/191 Test  #89: Grid_HypreSemiStructuredSolver_MPI_test_SERIAL_np_1 ..........   Passed    0.86 sec
        Start  90: Grid_HypreSemiStructuredSolverMulti_MPI_test_SERIAL_np_1
 90/191 Test  #90: Grid_HypreSemiStructuredSolverMulti_MPI_test_SERIAL_np_1 .....   Passed    0.86 sec
        Start  91: Grid_FastFourierTransform_MPI_test_SERIAL_np_1
 91/191 Test  #91: Grid_FastFourierTransform_MPI_test_SERIAL_np_1 ...............   Passed    0.92 sec
        Start  92: Grid_GlobalGrid_MPI_test_CUDA_np_1
 92/191 Test  #92: Grid_GlobalGrid_MPI_test_CUDA_np_1 ...........................   Passed    0.87 sec
        Start  93: Grid_GlobalParticleComm_MPI_test_CUDA_np_1
 93/191 Test  #93: Grid_GlobalParticleComm_MPI_test_CUDA_np_1 ...................   Passed    0.90 sec
        Start  94: Grid_LocalGrid_MPI_test_CUDA_np_1
 94/191 Test  #94: Grid_LocalGrid_MPI_test_CUDA_np_1 ............................   Passed    0.85 sec
        Start  95: Grid_IndexConversion_MPI_test_CUDA_np_1
 95/191 Test  #95: Grid_IndexConversion_MPI_test_CUDA_np_1 ......................   Passed    2.00 sec
        Start  96: Grid_LocalMesh3d_MPI_test_CUDA_np_1
 96/191 Test  #96: Grid_LocalMesh3d_MPI_test_CUDA_np_1 ..........................   Passed    2.15 sec
        Start  97: Grid_LocalMesh2d_MPI_test_CUDA_np_1
 97/191 Test  #97: Grid_LocalMesh2d_MPI_test_CUDA_np_1 ..........................   Passed    0.88 sec
        Start  98: Grid_Array3d_MPI_test_CUDA_np_1
 98/191 Test  #98: Grid_Array3d_MPI_test_CUDA_np_1 ..............................   Passed    0.93 sec
        Start  99: Grid_Array2d_MPI_test_CUDA_np_1
 99/191 Test  #99: Grid_Array2d_MPI_test_CUDA_np_1 ..............................   Passed    1.00 sec
        Start 100: Grid_Halo3d_MPI_test_CUDA_np_1
100/191 Test #100: Grid_Halo3d_MPI_test_CUDA_np_1 ...............................   Passed    1.68 sec
        Start 101: Grid_Halo2d_MPI_test_CUDA_np_1
101/191 Test #101: Grid_Halo2d_MPI_test_CUDA_np_1 ...............................   Passed    1.11 sec
        Start 102: Grid_ParticleInit_MPI_test_CUDA_np_1
102/191 Test #102: Grid_ParticleInit_MPI_test_CUDA_np_1 .........................   Passed    1.29 sec
        Start 103: Grid_ParticleGridDistributor2d_MPI_test_CUDA_np_1
103/191 Test #103: Grid_ParticleGridDistributor2d_MPI_test_CUDA_np_1 ............   Passed    1.03 sec
        Start 104: Grid_ParticleGridDistributor3d_MPI_test_CUDA_np_1
104/191 Test #104: Grid_ParticleGridDistributor3d_MPI_test_CUDA_np_1 ............   Passed    1.04 sec
        Start 105: Grid_SplineEvaluation3d_MPI_test_CUDA_np_1
105/191 Test #105: Grid_SplineEvaluation3d_MPI_test_CUDA_np_1 ...................   Passed    1.71 sec
        Start 106: Grid_SplineEvaluation2d_MPI_test_CUDA_np_1
106/191 Test #106: Grid_SplineEvaluation2d_MPI_test_CUDA_np_1 ...................   Passed    0.90 sec
        Start 107: Grid_Interpolation3d_MPI_test_CUDA_np_1
107/191 Test #107: Grid_Interpolation3d_MPI_test_CUDA_np_1 ......................   Passed    0.93 sec
        Start 108: Grid_Interpolation2d_MPI_test_CUDA_np_1
108/191 Test #108: Grid_Interpolation2d_MPI_test_CUDA_np_1 ......................   Passed    0.88 sec
        Start 109: Grid_BovWriter_MPI_test_CUDA_np_1
109/191 Test #109: Grid_BovWriter_MPI_test_CUDA_np_1 ............................   Passed    0.95 sec
        Start 110: Grid_Parallel_MPI_test_CUDA_np_1
110/191 Test #110: Grid_Parallel_MPI_test_CUDA_np_1 .............................   Passed    0.95 sec
        Start 111: Grid_Partitioner_MPI_test_CUDA_np_1
111/191 Test #111: Grid_Partitioner_MPI_test_CUDA_np_1 ..........................   Passed    0.86 sec
        Start 112: Grid_ParticleList_MPI_test_CUDA_np_1
112/191 Test #112: Grid_ParticleList_MPI_test_CUDA_np_1 .........................   Passed    0.87 sec
        Start 113: Grid_SparseArray_MPI_test_CUDA_np_1
113/191 Test #113: Grid_SparseArray_MPI_test_CUDA_np_1 ..........................   Passed    0.92 sec
        Start 114: Grid_SparseDimPartitioner_MPI_test_CUDA_np_1
114/191 Test #114: Grid_SparseDimPartitioner_MPI_test_CUDA_np_1 .................   Passed    0.96 sec
        Start 115: Grid_SparseHalo_MPI_test_CUDA_np_1
115/191 Test #115: Grid_SparseHalo_MPI_test_CUDA_np_1 ...........................   Passed    0.87 sec
        Start 116: Grid_SparseLocalGrid_MPI_test_CUDA_np_1
116/191 Test #116: Grid_SparseLocalGrid_MPI_test_CUDA_np_1 ......................   Passed    0.85 sec
        Start 117: Grid_HypreStructuredSolver3d_MPI_test_CUDA_np_1
117/191 Test #117: Grid_HypreStructuredSolver3d_MPI_test_CUDA_np_1 ..............   Passed    1.27 sec
        Start 118: Grid_HypreStructuredSolver2d_MPI_test_CUDA_np_1
118/191 Test #118: Grid_HypreStructuredSolver2d_MPI_test_CUDA_np_1 ..............   Passed    1.46 sec
        Start 119: Grid_HypreSemiStructuredSolver_MPI_test_CUDA_np_1
119/191 Test #119: Grid_HypreSemiStructuredSolver_MPI_test_CUDA_np_1 ............   Passed    1.31 sec
        Start 120: Grid_HypreSemiStructuredSolverMulti_MPI_test_CUDA_np_1
120/191 Test #120: Grid_HypreSemiStructuredSolverMulti_MPI_test_CUDA_np_1 .......   Passed    1.45 sec
        Start 121: Grid_FastFourierTransform_MPI_test_CUDA_np_1
121/191 Test #121: Grid_FastFourierTransform_MPI_test_CUDA_np_1 .................   Passed    2.73 sec
        Start 122: Grid_GlobalGrid_MPI_test_CUDA_UVM_np_1
122/191 Test #122: Grid_GlobalGrid_MPI_test_CUDA_UVM_np_1 .......................   Passed    0.87 sec
        Start 123: Grid_GlobalParticleComm_MPI_test_CUDA_UVM_np_1
123/191 Test #123: Grid_GlobalParticleComm_MPI_test_CUDA_UVM_np_1 ...............   Passed    0.90 sec
        Start 124: Grid_LocalGrid_MPI_test_CUDA_UVM_np_1
124/191 Test #124: Grid_LocalGrid_MPI_test_CUDA_UVM_np_1 ........................   Passed    0.86 sec
        Start 125: Grid_IndexConversion_MPI_test_CUDA_UVM_np_1
125/191 Test #125: Grid_IndexConversion_MPI_test_CUDA_UVM_np_1 ..................   Passed    2.22 sec
        Start 126: Grid_LocalMesh3d_MPI_test_CUDA_UVM_np_1
126/191 Test #126: Grid_LocalMesh3d_MPI_test_CUDA_UVM_np_1 ......................   Passed    2.15 sec
        Start 127: Grid_LocalMesh2d_MPI_test_CUDA_UVM_np_1
127/191 Test #127: Grid_LocalMesh2d_MPI_test_CUDA_UVM_np_1 ......................   Passed    0.89 sec
        Start 128: Grid_Array3d_MPI_test_CUDA_UVM_np_1
128/191 Test #128: Grid_Array3d_MPI_test_CUDA_UVM_np_1 ..........................   Passed    0.95 sec
        Start 129: Grid_Array2d_MPI_test_CUDA_UVM_np_1
129/191 Test #129: Grid_Array2d_MPI_test_CUDA_UVM_np_1 ..........................   Passed    0.88 sec
        Start 130: Grid_Halo3d_MPI_test_CUDA_UVM_np_1
130/191 Test #130: Grid_Halo3d_MPI_test_CUDA_UVM_np_1 ...........................   Passed    1.67 sec
        Start 131: Grid_Halo2d_MPI_test_CUDA_UVM_np_1
131/191 Test #131: Grid_Halo2d_MPI_test_CUDA_UVM_np_1 ...........................   Passed    0.94 sec
        Start 132: Grid_ParticleInit_MPI_test_CUDA_UVM_np_1
132/191 Test #132: Grid_ParticleInit_MPI_test_CUDA_UVM_np_1 .....................   Passed    1.06 sec
        Start 133: Grid_ParticleGridDistributor2d_MPI_test_CUDA_UVM_np_1
133/191 Test #133: Grid_ParticleGridDistributor2d_MPI_test_CUDA_UVM_np_1 ........   Passed    0.91 sec
        Start 134: Grid_ParticleGridDistributor3d_MPI_test_CUDA_UVM_np_1
134/191 Test #134: Grid_ParticleGridDistributor3d_MPI_test_CUDA_UVM_np_1 ........   Passed    0.96 sec
        Start 135: Grid_SplineEvaluation3d_MPI_test_CUDA_UVM_np_1
135/191 Test #135: Grid_SplineEvaluation3d_MPI_test_CUDA_UVM_np_1 ...............   Passed    1.56 sec
        Start 136: Grid_SplineEvaluation2d_MPI_test_CUDA_UVM_np_1
136/191 Test #136: Grid_SplineEvaluation2d_MPI_test_CUDA_UVM_np_1 ...............   Passed    0.90 sec
        Start 137: Grid_Interpolation3d_MPI_test_CUDA_UVM_np_1
137/191 Test #137: Grid_Interpolation3d_MPI_test_CUDA_UVM_np_1 ..................   Passed    0.98 sec
        Start 138: Grid_Interpolation2d_MPI_test_CUDA_UVM_np_1
138/191 Test #138: Grid_Interpolation2d_MPI_test_CUDA_UVM_np_1 ..................   Passed    0.88 sec
        Start 139: Grid_BovWriter_MPI_test_CUDA_UVM_np_1
139/191 Test #139: Grid_BovWriter_MPI_test_CUDA_UVM_np_1 ........................   Passed    0.98 sec
        Start 140: Grid_Parallel_MPI_test_CUDA_UVM_np_1
140/191 Test #140: Grid_Parallel_MPI_test_CUDA_UVM_np_1 .........................   Passed    0.96 sec
        Start 141: Grid_Partitioner_MPI_test_CUDA_UVM_np_1
141/191 Test #141: Grid_Partitioner_MPI_test_CUDA_UVM_np_1 ......................   Passed    0.87 sec
        Start 142: Grid_ParticleList_MPI_test_CUDA_UVM_np_1
142/191 Test #142: Grid_ParticleList_MPI_test_CUDA_UVM_np_1 .....................   Passed    0.85 sec
        Start 143: Grid_SparseArray_MPI_test_CUDA_UVM_np_1
143/191 Test #143: Grid_SparseArray_MPI_test_CUDA_UVM_np_1 ......................   Passed    0.95 sec
        Start 144: Grid_SparseDimPartitioner_MPI_test_CUDA_UVM_np_1
144/191 Test #144: Grid_SparseDimPartitioner_MPI_test_CUDA_UVM_np_1 .............   Passed    1.00 sec
        Start 145: Grid_SparseHalo_MPI_test_CUDA_UVM_np_1
145/191 Test #145: Grid_SparseHalo_MPI_test_CUDA_UVM_np_1 .......................   Passed    0.90 sec
        Start 146: Grid_SparseLocalGrid_MPI_test_CUDA_UVM_np_1
146/191 Test #146: Grid_SparseLocalGrid_MPI_test_CUDA_UVM_np_1 ..................   Passed    0.89 sec
        Start 147: Grid_HypreStructuredSolver3d_MPI_test_CUDA_UVM_np_1
147/191 Test #147: Grid_HypreStructuredSolver3d_MPI_test_CUDA_UVM_np_1 ..........   Passed    0.88 sec
        Start 148: Grid_HypreStructuredSolver2d_MPI_test_CUDA_UVM_np_1
148/191 Test #148: Grid_HypreStructuredSolver2d_MPI_test_CUDA_UVM_np_1 ..........   Passed    0.88 sec
        Start 149: Grid_HypreSemiStructuredSolver_MPI_test_CUDA_UVM_np_1
149/191 Test #149: Grid_HypreSemiStructuredSolver_MPI_test_CUDA_UVM_np_1 ........   Passed    0.89 sec
        Start 150: Grid_HypreSemiStructuredSolverMulti_MPI_test_CUDA_UVM_np_1
150/191 Test #150: Grid_HypreSemiStructuredSolverMulti_MPI_test_CUDA_UVM_np_1 ...   Passed    0.89 sec
        Start 151: Grid_FastFourierTransform_MPI_test_CUDA_UVM_np_1
151/191 Test #151: Grid_FastFourierTransform_MPI_test_CUDA_UVM_np_1 .............   Passed    1.77 sec
        Start 152: Core_tutorial_01
152/191 Test #152: Core_tutorial_01 .............................................   Passed    0.67 sec
        Start 153: Core_tutorial_02
153/191 Test #153: Core_tutorial_02 .............................................   Passed    0.66 sec
        Start 154: Core_tutorial_03
154/191 Test #154: Core_tutorial_03 .............................................   Passed    0.67 sec
        Start 155: Core_tutorial_04_unmanaged
155/191 Test #155: Core_tutorial_04_unmanaged ...................................   Passed    0.71 sec
        Start 156: Core_tutorial_04
156/191 Test #156: Core_tutorial_04 .............................................   Passed    0.71 sec
        Start 157: Core_tutorial_05
157/191 Test #157: Core_tutorial_05 .............................................   Passed    0.71 sec
        Start 158: Core_tutorial_06
158/191 Test #158: Core_tutorial_06 .............................................   Passed    0.69 sec
        Start 159: Core_tutorial_07
159/191 Test #159: Core_tutorial_07 .............................................   Passed    0.67 sec
        Start 160: Core_tutorial_08
160/191 Test #160: Core_tutorial_08 .............................................   Passed    0.68 sec
        Start 161: Core_tutorial_09
161/191 Test #161: Core_tutorial_09 .............................................   Passed    0.71 sec
        Start 162: Core_tutorial_09_arborx
162/191 Test #162: Core_tutorial_09_arborx ......................................   Passed    0.72 sec
        Start 163: Core_tutorial_10_neighbor
163/191 Test #163: Core_tutorial_10_neighbor ....................................   Passed    0.71 sec
        Start 164: Core_tutorial_10_simd
164/191 Test #164: Core_tutorial_10_simd ........................................   Passed    0.72 sec
        Start 165: Core_tutorial_11
165/191 Test #165: Core_tutorial_11 .............................................   Passed    0.91 sec
        Start 166: Core_tutorial_12
166/191 Test #166: Core_tutorial_12 .............................................   Passed    0.88 sec
        Start 167: Core_tutorial_05_cuda
167/191 Test #167: Core_tutorial_05_cuda ........................................   Passed    0.67 sec
        Start 168: Grid_tutorial_01
168/191 Test #168: Grid_tutorial_01 .............................................   Passed    0.71 sec
        Start 169: Grid_tutorial_02
169/191 Test #169: Grid_tutorial_02 .............................................   Passed    0.71 sec
        Start 170: Grid_tutorial_03
170/191 Test #170: Grid_tutorial_03 .............................................   Passed    0.89 sec
        Start 171: Grid_tutorial_04
171/191 Test #171: Grid_tutorial_04 .............................................   Passed    0.88 sec
        Start 172: Grid_tutorial_05
172/191 Test #172: Grid_tutorial_05 .............................................   Passed    0.67 sec
        Start 173: Grid_tutorial_06
173/191 Test #173: Grid_tutorial_06 .............................................   Passed    0.87 sec
        Start 174: Grid_tutorial_07
174/191 Test #174: Grid_tutorial_07 .............................................   Passed    0.89 sec
        Start 175: Grid_tutorial_08
175/191 Test #175: Grid_tutorial_08 .............................................   Passed    0.92 sec
        Start 176: Grid_tutorial_09
176/191 Test #176: Grid_tutorial_09 .............................................   Passed    1.00 sec
        Start 177: Grid_tutorial_10
177/191 Test #177: Grid_tutorial_10 .............................................   Passed    0.87 sec
        Start 178: Grid_tutorial_11
178/191 Test #178: Grid_tutorial_11 .............................................   Passed    0.99 sec
        Start 179: Grid_tutorial_12
179/191 Test #179: Grid_tutorial_12 .............................................   Passed    0.87 sec
        Start 180: Grid_tutorial_14
180/191 Test #180: Grid_tutorial_14 .............................................   Passed    0.86 sec
        Start 181: Grid_tutorial_15
181/191 Test #181: Grid_tutorial_15 .............................................   Passed    0.86 sec
        Start 182: Cabana_Performance_BinSort
182/191 Test #182: Cabana_Performance_BinSort ...................................   Passed    1.02 sec
        Start 183: Cabana_Performance_NeighborVerlet
183/191 Test #183: Cabana_Performance_NeighborVerlet ............................   Passed    1.23 sec
        Start 184: Cabana_Performance_NeighborArborX
184/191 Test #184: Cabana_Performance_NeighborArborX ............................   Passed    1.74 sec
        Start 185: Cabana_Performance_LinkedCell
185/191 Test #185: Cabana_Performance_LinkedCell ................................   Passed    0.71 sec
        Start 186: Cabana_Performance_Comm
186/191 Test #186: Cabana_Performance_Comm ......................................   Passed    1.53 sec
        Start 187: Cabana_Grid_Performance_SparseMap
187/191 Test #187: Cabana_Grid_Performance_SparseMap ............................   Passed    0.88 sec
        Start 188: Cabana_Grid_Performance_SparsePartitioner
188/191 Test #188: Cabana_Grid_Performance_SparsePartitioner ....................   Passed    1.80 sec
        Start 189: Cabana_Grid_Performance_Halo
189/191 Test #189: Cabana_Grid_Performance_Halo .................................   Passed    0.87 sec
        Start 190: Cabana_Grid_Performance_Interpolation
190/191 Test #190: Cabana_Grid_Performance_Interpolation ........................   Passed   37.41 sec
        Start 191: Cabana_Grid_Performance_FastFourierTransform
191/191 Test #191: Cabana_Grid_Performance_FastFourierTransform .................   Passed    2.09 sec

100% tests passed, 0 tests failed out of 191

Total Test time (real) = 219.95 sec
Post stage
[Pipeline] sh
+ ccache --show-stats
cache directory                     /tmp/ccache
primary config                      /tmp/ccache/ccache.conf
secondary config      (readonly)    /etc/ccache.conf
stats updated                       Thu Mar 21 20:04:42 2024
stats zeroed                        Thu Mar 21 19:54:15 2024
cache hit (direct)                   130
cache hit (preprocessed)               8
cache miss                           204
cache hit rate                     40.35 %
cleanups performed                     0
files in cache                      4174
cache size                           6.2 GB
max cache size                      10.0 GB
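For reference, the reported 40.35 % is consistent with the hit rate being hits (direct plus preprocessed) over all compilations ccache handled in this run:

\[
\text{cache hit rate} = \frac{130 + 8}{130 + 8 + 204} = \frac{138}{342} \approx 40.35\%
\]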
[Pipeline] }
$ docker stop --time=1 7f3e2cf1b4fdb51ff8bc8f4361b02a7b6af716ea9f03ba03564695daaaa6242c
$ docker rm -f --volumes 7f3e2cf1b4fdb51ff8bc8f4361b02a7b6af716ea9f03ba03564695daaaa6242c
[Pipeline] // withDockerContainer
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Cancelling nested steps due to timeout
[Pipeline] // node
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Failed in branch ROCM-5.2-HIPCC-DEBUG
[Pipeline] // parallel
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] End of Pipeline
ERROR: script returned exit code 2

GitHub has been notified of this commit’s build result

Finished: FAILURE