ARCHER2
Previously a script-based installation was used to install Firedrake on ARCHER2. This is no longer recommended and we now encourage you to use Spack. (The old repository is available here.)
We will work in a directory /work/[budget code]/[budget code]/[$USER]/workspace, where [budget code] is replaced by your project code and [$USER] by your username.
You should familiarise yourself with as much of Spack's functionality as possible: to achieve the same flexibility as we have on desktop installs, the Firedrake Spack install leverages many of Spack's features. More information about Spack can be found on their website.
Start in a directory on the WORK filesystem, not HOME, as HOME is not accessible from the compute nodes. You may wish to set SPACK_USER_CONFIG_PATH (which defaults to ~/.spack) to change where Spack looks for user-level configuration files. Changing SPACK_USER_CONFIG_PATH isn't necessary if everything is correctly configured within the Spack environment.
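As a minimal sketch, relocating the user-level configuration to WORK might look like the following. Here t01 and auser are placeholder budget code and username, and FIREDRAKE_WORKSPACE is a made-up variable used only so the snippet also runs off ARCHER2:

```shell
# Sketch only: t01/auser are placeholders for your budget code and username.
# On ARCHER2 you would set FIREDRAKE_WORKSPACE=/work/t01/t01/auser/workspace;
# otherwise fall back to a workspace under the current directory.
workspace="${FIREDRAKE_WORKSPACE:-$PWD/workspace}"
export SPACK_USER_CONFIG_PATH="$workspace/.spack"
mkdir -p "$SPACK_USER_CONFIG_PATH/cray"
echo "Spack user config: $SPACK_USER_CONFIG_PATH"
```

The compilers.yaml and packages.yaml described below then live under this directory.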
We recommend using the GNU toolchain and the GCC compilers at version 10.2.0; see the additional notes for more information.
The following clones the Spack repo, loads modules to set up the environment, and activates Spack.
git clone -c feature.manyFiles=true [email protected]:spack/spack.git
module purge
module load load-epcc-module
# Load programming environment and compiler
module load PrgEnv-gnu/8.1.0
module swap gcc gcc/10.2.0
module swap cray-mpich cray-mpich/8.1.9
module swap cray-dsmml cray-dsmml/0.2.1
module swap cray-libsci cray-libsci/21.08.1.2
module load xpmem
module load cmake/3.21.3
module load cray-python/3.9.4.1
. spack/share/spack/setup-env.sh
If you want to use different BLAS, LAPACK, and ScaLAPACK libraries, you don't need to load cray-libsci and you should run module unload cray-libsci. For instance, the AMD optimised CPU libraries (AOCL) could be loaded instead with module load aocl/3.1.
If you really want to try the other compilers, replace the two lines under # Load programming environment and compiler with
module load PrgEnv-aocc/8.1.0
module swap aocc aocc/3.0.0
# or
module load PrgEnv-cray/8.1.0
module swap cce cce/12.0.3
for the AMD or Cray compilers respectively.
- Populate $SPACK_USER_CONFIG_PATH/cray/compilers.yaml with the following YAML. Note that this contains most of the available compilers; you only need to configure the compilers you will use.
$SPACK_USER_CONFIG_PATH/cray/compilers.yaml
compilers:
- compiler:
spec: [email protected]
paths:
cc: cc
cxx: CC
f77: ftn
fc: ftn
flags:
cflags: null
cxxflags: null
fflags: null
operating_system: sles15
target: any
modules:
- PrgEnv-aocc
- aocc/2.2.0
environment: {}
extra_rpaths: []
- compiler:
spec: [email protected]
paths:
cc: cc
cxx: CC
f77: ftn
fc: ftn
flags:
cflags: null
cxxflags: null
fflags: null
operating_system: sles15
target: any
modules:
- PrgEnv-aocc
- aocc/2.2.0.1
environment: {}
extra_rpaths: []
- compiler:
spec: [email protected]
paths:
cc: cc
cxx: CC
f77: ftn
fc: ftn
flags:
cflags: -Wno-unused-command-line-argument -mllvm -eliminate-similar-expr=false
cxxflags: -Wno-unused-command-line-argument -mllvm -eliminate-similar-expr=false
fflags: -Wno-unused-command-line-argument -mllvm -eliminate-similar-expr=false
ldflags: -Wl,-rpath,/opt/cray/libfabric/1.11.0.4.71/lib64 -L/opt/cray/libfabric/1.11.0.4.71/lib64 -lfabric
operating_system: sles15
target: any
modules:
- PrgEnv-aocc
- aocc/3.0.0
environment: {}
extra_rpaths: []
- compiler:
spec: [email protected]
paths:
cc: cc
cxx: CC
f77: ftn
fc: ftn
flags: {}
operating_system: sles15
target: any
modules:
- PrgEnv-cray
- cce/11.0.4
environment: {}
extra_rpaths: []
- compiler:
spec: [email protected]
paths:
cc: cc
cxx: CC
f77: ftn
fc: ftn
flags: {}
operating_system: sles15
target: any
modules:
- PrgEnv-cray
- cce/12.0.3
environment: {}
extra_rpaths: []
- compiler:
spec: [email protected]
paths:
cc: cc
cxx: CC
f77: ftn
fc: ftn
flags: {}
operating_system: sles15
target: any
modules:
- PrgEnv-gnu
- gcc/9.3.0
environment: {}
extra_rpaths: []
- compiler:
spec: [email protected]
paths:
cc: cc
cxx: CC
f77: ftn
fc: ftn
flags: {}
operating_system: sles15
target: any
modules:
- PrgEnv-gnu
- gcc/10.2.0
environment: {}
extra_rpaths: []
- compiler:
spec: [email protected]
paths:
cc: cc
cxx: CC
f77: ftn
fc: ftn
flags: {}
operating_system: sles15
target: any
modules:
- PrgEnv-gnu
- gcc/10.3.0
environment: {}
extra_rpaths: []
- compiler:
spec: [email protected]
paths:
cc: cc
cxx: CC
f77: ftn
fc: ftn
flags: {}
operating_system: sles15
target: any
modules:
- PrgEnv-gnu
- gcc/11.2.0
environment: {}
extra_rpaths: []
- compiler:
spec: [email protected]
paths:
cc: cc
cxx: CC
f77: ftn
fc: ftn
flags: {}
operating_system: sles15
target: any
modules:
- PrgEnv-amd
- rocmcc/2.14-nosmp
environment: {}
extra_rpaths: []
- compiler:
spec: [email protected]
paths:
cc: cc
cxx: CC
f77: ftn
fc: ftn
flags: {}
operating_system: sles15
target: any
modules:
- PrgEnv-amd
- rocmcc/2.14
environment: {}
extra_rpaths: []
- Populate $SPACK_USER_CONFIG_PATH/packages.yaml with the following:
$SPACK_USER_CONFIG_PATH/packages.yaml
packages:
all:
providers:
mpi: [cray-mpich]
cray-mpich:
externals:
- spec: [email protected]%[email protected]
modules:
- PrgEnv-gnu
- gcc/10.2.0
- cray-mpich/8.1.9
- spec: [email protected]%[email protected]
# Pretend mpich is version 8.1.6 due to the following bug in spack
# https://github.com/spack/spack/issues/29459
# Only with aocc compiler
modules:
- PrgEnv-aocc
- aocc/3.0.0
- cray-mpich/8.1.9
prefix: /opt/cray/pe/mpich/8.1.9/ofi/aocc/3.0
- spec: [email protected]%[email protected]
modules:
- PrgEnv-cray
- cce/12.0.3
- cray-mpich/8.1.9
buildable: False
cray-libsci:
externals:
- spec: [email protected]
modules:
- cray-libsci/21.04.1.1
- spec: [email protected]
modules:
- cray-libsci/21.08.1.2
environment:
append_path:
- PE_PKGCONFIG_PRODUCTS: PE_MPICH
buildable: False
amdblis:
externals:
- spec: [email protected]
modules:
- aocl/3.1
buildable: False
amdfftw:
externals:
- spec: [email protected]
modules:
- aocl/3.1
buildable: False
amdlibflame:
externals:
- spec: [email protected]
modules:
- aocl/3.1
buildable: False
amdlibm:
externals:
- spec: [email protected]
modules:
- aocl/3.1
buildable: False
amdscalapack:
externals:
- spec: [email protected]
modules:
- aocl/3.1
buildable: False
aocl-sparse:
externals:
- spec: [email protected]
modules:
- aocl/3.1
buildable: False
python:
externals:
- spec: [email protected]
modules:
- cray-python/3.9.4.1
- spec: [email protected]
modules:
- cray-python/3.8.5.0
buildable: False
flex:
externals:
- spec: [email protected]+lex
prefix: /usr
# System texinfo causes issues!
# texinfo:
# externals:
# - spec: [email protected]
# prefix: /usr
automake:
externals:
- spec: [email protected]
prefix: /usr
binutils:
externals:
- spec: [email protected]
prefix: /usr
bison:
externals:
- spec: [email protected]
prefix: /usr
libtool:
externals:
- spec: [email protected]
prefix: /usr
m4:
externals:
- spec: [email protected]
prefix: /usr
openssh:
externals:
- spec: [email protected]
prefix: /usr
gawk:
externals:
- spec: [email protected]
prefix: /usr
pkg-config:
externals:
- spec: [email protected]
prefix: /usr
buildable: False
pkgconf:
externals:
- spec: [email protected]
prefix: /usr
buildable: False
openssl:
externals:
- spec: [email protected]
prefix: /usr
groff:
externals:
- spec: [email protected]
prefix: /usr
autoconf:
externals:
- spec: [email protected]
prefix: /usr
findutils:
externals:
- spec: [email protected]
prefix: /usr
git:
externals:
- spec: [email protected]~tcltk
prefix: /usr
tar:
externals:
- spec: [email protected]
prefix: /usr
subversion:
externals:
- spec: [email protected]
prefix: /usr
cmake:
externals:
- spec: [email protected]
prefix: /usr
- spec: [email protected]
prefix: /work/y07/shared/utils/core/cmake/3.21.3
gmake:
externals:
- spec: [email protected]
prefix: /usr
diffutils:
externals:
- spec: [email protected]
prefix: /usr
-
Clone the repo:
git clone https://github.com/firedrakeproject/firedrake-spack.git # or git clone [email protected]:firedrakeproject/firedrake-spack.git
-
Add the repository to Spack
spack repo add <repo directory>
-
Create a Spack environment
spack env create -d ./firedrake
-
Activate that environment
spack env activate -p ./firedrake
-
To avoid a number of errors, add all of the Firedrake packages to the development package list:
spack develop py-firedrake@develop
spack develop libsupermesh@develop
spack develop petsc@develop
spack develop chaco@petsc
spack develop py-fiat@develop
spack develop py-finat@develop
spack develop py-islpy@develop
spack develop py-petsc4py@develop
spack develop py-pyadjoint@develop
spack develop py-pyop2@develop
spack develop py-coffee@develop
spack develop py-loopy@develop
spack develop py-cgen@develop
spack develop py-codepy@develop
spack develop py-genpy@develop
spack develop py-tsfc@develop
spack develop py-ufl@develop
-
To work around the fact that /home is not accessible from the compute nodes, the location of the firedrake-spack repo needs to be set inside $SPACK_ENV/spack.yaml:
spack:
  repos:
  - /work/[budget code]/[budget code]/[$USER]/workspace/firedrake-spack
where the path should point to the repo cloned in step 1.
-
Then custom configure the packages inside $SPACK_ENV/spack.yaml:
spack:
  include:
  - packages.yaml
-
And create $SPACK_ENV/packages.yaml with contents:
packages:
  all:
    compiler: [[email protected]]
    providers:
      mpi: [cray-mpich]
      blas: [cray-libsci]
      lapack: [cray-libsci]
      scalapack: [cray-libsci]
  netlib-scalapack:
    externals:
    # Pretend cray-libsci is netlib-scalapack
    - spec: [email protected]
      modules:
      - cray-libsci/21.08.1.2
    buildable: False
  py-numpy:
    externals:
    - spec: [email protected]
      modules:
      - cray-libsci/21.08.1.2
      - cray-python/3.9.4.1
      prefix: /opt/cray/pe/python/3.9.4.1/
    buildable: False
  py-scipy:
    externals:
    - spec: [email protected]
      modules:
      - cray-libsci/21.08.1.2
      - cray-python/3.9.4.1
      prefix: /opt/cray/pe/python/3.9.4.1/
    buildable: False
If using the AMD or Cray compilers, replace the compiler above ([[email protected]]) with [[email protected]] or [[email protected]] respectively.
-
Add Firedrake to the spec for the environment
spack add py-firedrake@develop \
  %[email protected] \
  ^[email protected] \
  ^[email protected]%[email protected] \
  ^[email protected] \
  ^firedrake.petsc@develop+fftw \
    cflags=\"-O3 -march=native -mtune=native\" \
    cxxflags=\"-O3 -march=native -mtune=native\" \
    fflags=\"-O3 -march=native -mtune=native -ffree-line-length-512\"
-
Concretize (and log)
spack concretize -f 2>&1 | tee $SPACK_ENV/conc.log
- Splitting the concretize and install steps into two distinct parts allows you to review what is being installed before the lengthy installation process begins.
-
Install (and log)
spack install --fail-fast 2>&1 | tee $SPACK_ENV/spack-firedrake-install.log
Testing must be run on a compute node. An interactive session can be started using
srun --nodes=1 --exclusive --time=00:20:00 \
--partition=standard --qos=short --reservation=shortqos \
--account=[budget code] --pty /bin/bash
Alternatively submit a jobscript, see additional notes.
- Test you can import Firedrake by running
python -c "from firedrake import *"
- If this fails, before trying anything else, deactivate the environment with spack env deactivate, reactivate it with spack env activate -p ./firedrake (as above), and try running python -c "from firedrake import *" again. This appears to be a shortcoming of Spack.
- Run the basic functionality tests:
cd $SPACK_ENV/py-firedrake
pytest tests/regression/ -m "not parallel" -k "poisson_strong or stokes_mini or dg_advection"
- When running the environment on the compute nodes, the variable $HOME needs to be set to something like /work/[budget code]/[budget code]/[$USER] or /tmp, as many dependencies expect to find configuration in $HOME.
- We don't run parallel tests due to an issue with the current parallel pytest hook; this is in the process of being fixed.
- Run the full test suite:
cd $SPACK_ENV/src/firedrake
pytest tests -m "not parallel"
If you pip install a Python package on ARCHER2, with or without the Spack environment active, it will be installed as a user package and therefore be located in the $HOME directory. This is an issue since the $HOME directory is not mounted on the compute nodes.
The obvious solution is not to use pip to install the package, and instead attempt to add it to the environment. If the PyPI package is named foo, the corresponding Spack package is likely called py-foo. Spack's builtin repository can be searched by executing spack list foo or spack list py-foo.
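The py- prefix is just Spack's naming convention for Python packages; a sketch of the usual mapping is below (spack_name is a hypothetical helper for illustration, not a Spack API):

```python
def spack_name(pypi_name: str) -> str:
    """Hypothetical helper: conventional PyPI -> Spack package name.

    Spack's builtin Python packages are generally named py-<name>,
    lowercased, with underscores and dots replaced by hyphens.
    """
    return "py-" + pypi_name.lower().replace("_", "-").replace(".", "-")


# Candidate names to try with `spack list`:
print(spack_name("mpi4py"))       # py-mpi4py
print(spack_name("ruamel.yaml"))  # py-ruamel-yaml
```

This is only a convention, so always confirm the result with spack list before relying on it.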
If the package is not present the correct solution is to create one and upstream it into Spack's builtin package repository. There are instructions in Spack's documentation on how to do this.
This page is based on installation notes available here. Those notes contain more details of bugs you may encounter and how to build with other compilers.
Example jobscript:
#!/bin/bash
#SBATCH -p standard
#SBATCH -A [budget code]
#SBATCH -J spack
#SBATCH --nodes=1
#SBATCH --cpus-per-task=1
#SBATCH --qos=standard
#SBATCH -t 0:10:00
myScript="run.sh"
module purge
module load load-epcc-module
module load PrgEnv-gnu/8.1.0
module swap gcc gcc/10.2.0
module swap cray-mpich cray-mpich/8.1.9
module swap cray-dsmml cray-dsmml/0.2.1
module swap cray-libsci cray-libsci/21.08.1.2
module load xpmem
module load cmake/3.21.3
module load cray-python/3.9.4.1
export HOME=/work/[budget code]/[budget code]/[$USER]
source /work/[budget code]/[budget code]/[$USER]/workspace/spack/share/spack/setup-env.sh
spack env activate /work/[budget code]/[budget code]/[$USER]/workspace/firedrake
./${myScript}