
Conversation


laraPPr (Collaborator) commented Dec 12, 2025

No description provided.

laraPPr added the 2025.06-software.eessi.io label (2025.06 version of software.eessi.io) on Dec 12, 2025

laraPPr commented Dec 12, 2025

bot: build repo:eessi.io-2025.06-software instance:eessi-bot-mc-aws for:arch=aarch64/neoverse_n1
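The directive above instructs the bot via space-separated key:value filters. As an illustration only (this is a hypothetical breakdown, not the EESSI bot's actual parser), the fields can be split out like so:

```shell
# Hypothetical breakdown of the bot directive fields shown above;
# the real bot's command parsing may differ.
cmd='bot: build repo:eessi.io-2025.06-software instance:eessi-bot-mc-aws for:arch=aarch64/neoverse_n1'

for tok in $cmd; do
  case "$tok" in
    repo:*)     echo "repository: ${tok#repo:}" ;;      # which software repo to build for
    instance:*) echo "bot instance: ${tok#instance:}" ;; # which bot instance should pick up the job
    for:arch=*) echo "target arch: ${tok#for:arch=}" ;;  # CPU target (family/microarchitecture)
  esac
done
```

The instance: filter matters because several bot instances may watch the same repository; only the named one (here, eessi-bot-mc-aws) submits a job.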


eessi-bot-aws bot commented Dec 12, 2025

New job on instance eessi-bot-mc-aws for repository eessi.io-2025.06-software
Building on: neoverse_n1
Building for: aarch64/neoverse_n1
Job dir: /project/def-users/SHARED/jobs/2025.12/pr_1337/112386

| date | job status | comment |
|------|------------|---------|
| Dec 12 15:11:08 UTC 2025 | submitted | job id 112386 awaits release by job manager |
| Dec 12 15:11:49 UTC 2025 | released | job awaits launch by Slurm scheduler |
| Dec 12 15:17:58 UTC 2025 | running | job 112386 is running |
| Dec 12 16:15:36 UTC 2025 | finished | 😢 FAILURE (details below) |
Details
✅ job output file slurm-112386.out
✅ no message matching FATAL:
❌ found message matching ERROR:
❌ found message matching FAILED:
❌ found message matching required modules missing:
❌ no message matching No missing installations
✅ found message matching .tar.* created!
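The ✅/❌ lines above report pattern checks against the job output file. A minimal sketch of this kind of check, assuming plain grep pattern matching on the Slurm output (the bot's real implementation may differ; the sample file contents below are invented for illustration):

```shell
# Build a sample job output file for illustration only (hypothetical contents;
# the real checks run against slurm-112386.out from the job directory).
cat > sample-slurm.out <<'EOF'
ERROR: build of one or more easyconfigs failed
eessi-2025.06-software-linux-aarch64-neoverse_n1-17655559860.tar.zst created!
EOF

# Report whether a pattern occurs in the job output, mirroring the
# "found/no message matching" lines in the bot comment.
check() {
  if grep -q -E "$1" sample-slurm.out; then
    echo "found message matching $1"
  else
    echo "no message matching $1"
  fi
}

check 'FATAL:'              # expect: no message matching FATAL:
check 'ERROR:'              # expect: found message matching ERROR:
check '\.tar\..* created!'  # expect: found message matching ...
```

Note that the overall verdict combines these checks: a tarball being created (last check) does not make the job a success if ERROR:/FAILED: messages were also found.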
Artefacts
eessi-2025.06-software-linux-aarch64-neoverse_n1-17655559860.tar.zst
size: 44 MiB (46436578 bytes)
entries: 1832
modules under 2025.06/software/linux/aarch64/neoverse_n1/modules/all
archspec/0.2.5-GCCcore-13.3.0.lua
kim-api/2.4.1-GCC-13.3.0.lua
MDI/1.4.26-gompi-2024a.lua
mpi4py/4.0.1-gompi-2024a.lua
PLUMED/2.9.3-foss-2024a.lua
ScaFaCoS/1.0.4-foss-2024a.lua
tbb/2021.13.0-GCCcore-13.3.0.lua
Voro++/0.4.6-GCCcore-13.3.0.lua
xxd/9.1.1275-GCCcore-13.3.0.lua
software under 2025.06/software/linux/aarch64/neoverse_n1/software
archspec/0.2.5-GCCcore-13.3.0
kim-api/2.4.1-GCC-13.3.0
MDI/1.4.26-gompi-2024a
mpi4py/4.0.1-gompi-2024a
PLUMED/2.9.3-foss-2024a
ScaFaCoS/1.0.4-foss-2024a
tbb/2021.13.0-GCCcore-13.3.0
Voro++/0.4.6-GCCcore-13.3.0
xxd/9.1.1275-GCCcore-13.3.0
reprod directories under 2025.06/software/linux/aarch64/neoverse_n1/reprod
archspec/0.2.5-GCCcore-13.3.0/20251212_151907UTC
kim-api/2.4.1-GCC-13.3.0/20251212_152142UTC
MDI/1.4.26-gompi-2024a/20251212_152231UTC
mpi4py/4.0.1-gompi-2024a/20251212_153657UTC
PLUMED/2.9.3-foss-2024a/20251212_152955UTC
ScaFaCoS/1.0.4-foss-2024a/20251212_155254UTC
tbb/2021.13.0-GCCcore-13.3.0/20251212_154053UTC
Voro++/0.4.6-GCCcore-13.3.0/20251212_151924UTC
xxd/9.1.1275-GCCcore-13.3.0/20251212_152237UTC
other under 2025.06/software/linux/aarch64/neoverse_n1
no other files in tarball
Dec 12 16:15:36 UTC 2025 test result
😁 SUCCESS (details below)
ReFrame Summary
[ OK ] (1/4) EESSI_OSU_coll %benchmark_info=mpi.collective.osu_allreduce %module_name=OSU-Micro-Benchmarks/7.5-gompi-2025a %scale=1_node %device_type=cpu /e4bf9965 @BotBuildTests:aarch64_neoverse_n1+default
P: latency: 1.94 us (r:0, l:None, u:None)
[ OK ] (2/4) EESSI_OSU_coll %benchmark_info=mpi.collective.osu_alltoall %module_name=OSU-Micro-Benchmarks/7.5-gompi-2025a %scale=1_node %device_type=cpu /3da4890b @BotBuildTests:aarch64_neoverse_n1+default
P: latency: 5.46 us (r:0, l:None, u:None)
[ OK ] (3/4) EESSI_OSU_pt2pt_CPU %benchmark_info=mpi.pt2pt.osu_latency %module_name=OSU-Micro-Benchmarks/7.5-gompi-2025a %scale=1_node /3255009a @BotBuildTests:aarch64_neoverse_n1+default
P: latency: 0.29 us (r:0, l:None, u:None)
[ OK ] (4/4) EESSI_OSU_pt2pt_CPU %benchmark_info=mpi.pt2pt.osu_bw %module_name=OSU-Micro-Benchmarks/7.5-gompi-2025a %scale=1_node /59f4b331 @BotBuildTests:aarch64_neoverse_n1+default
P: bandwidth: 15717.66 MB/s (r:0, l:None, u:None)
[ PASSED ] Ran 4/4 test case(s) from 4 check(s) (0 failure(s), 0 skipped, 0 aborted)
Details
✅ job output file slurm-112386.out
❌ found message matching ERROR:
✅ no message matching [\s*FAILED\s*].*Ran .* test case
