What happened?
Hi,
I'm starting to work on Jupiter, and I tried to submit a Slurm job with:
../WeatherGenerator-private/hpc/launch-slurm.py --stage inference --from-run-id fbclk6da --config ./config/inference/inference_era_o96_cerra_config.yml --account=e-ext-2025e01-128
This fails with the following error:
sbatch: error: Batch job submission failed: Invalid account or account/partition combination specified
I do have access to Jupiter through the "e-ext-2025e01-128" project. However, the "weatherai" account appears to be used by default for the cleanup job, and the --account=e-ext-2025e01-128 flag is not applied to it.
In my .bashrc file I have: jutil env activate -p e-ext-2025e01-128
What are the steps to reproduce the bug?
No response
Hedgedoc link to logs and more information. This ticket is public, do not attach files directly.
No response