
Init uah24s #892

Open
wants to merge 4 commits into base: master
2 changes: 2 additions & 0 deletions jupyter-images/uah24s/.condarc
@@ -0,0 +1,2 @@
envs_dirs:
- /home/jovyan/additional-envs
43 changes: 43 additions & 0 deletions jupyter-images/uah24s/Acknowledgements.ipynb
@@ -0,0 +1,43 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "c86cd54f-b73c-4781-b6eb-89c79d3d3b22",
"metadata": {},
"source": [
"## Acknowledgements\n",
"\n",
"Launching this JupyterHub server is the result of a collaboration among several research and academic institutions and their staff. For Jetstream2 and JupyterHub expertise, we thank Andrea Zonca (San Diego Supercomputer Center), Jeremy Fischer and Mike Lowe (Indiana University), and the NSF Jetstream2 (`doi:10.1145/3437359.3465565`) team.\n",
"\n",
"This work employs the NSF Jetstream2 Cloud at Indiana University through allocation EES220002 from the Advanced Cyberinfrastructure Coordination Ecosystem: Services & Support (ACCESS) program, which is supported by National Science Foundation grants #2138259, #2138286, #2138307, #2137603, and #2138296.\n",
"\n",
"Unidata is one of the University Corporation for Atmospheric Research (UCAR)'s Community Programs (UCP), and is funded primarily by the National Science Foundation (AGS-1901712).\n",
"\n",
"## To Acknowledge This JupyterHub and the Unidata Science Gateway\n",
"\n",
"If you have benefited from the Unidata Science Gateway, please cite `doi:10.5065/688s-2w73`. Additional citation information can be found in this [Citation File Format file](https://raw.githubusercontent.com/Unidata/science-gateway/master/CITATION.cff).\n"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.6"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
41 changes: 41 additions & 0 deletions jupyter-images/uah24s/Dockerfile
@@ -0,0 +1,41 @@
# Heavily borrowed from docker-stacks/minimal-notebook/
# https://github.com/jupyter/docker-stacks/blob/main/minimal-notebook/Dockerfile

ARG BASE_CONTAINER=jupyter/minimal-notebook
FROM $BASE_CONTAINER

ENV DEFAULT_ENV_NAME=uah24s

LABEL maintainer="Unidata <[email protected]>"

USER root

RUN apt-get update && \
    apt-get install -y --no-install-recommends vim curl ffmpeg && \
    apt-get clean && \
    rm -rf /var/lib/apt/lists/*

USER $NB_UID

# The destination must end with "/" when copying multiple files
COPY environment-tf.yaml environment-numba.yaml additional-env.yaml /tmp/

RUN mamba install --quiet --yes \
    'conda-forge::nb_conda_kernels' \
    'conda-forge::jupyterlab-git' \
    'conda-forge::ipywidgets' && \
    mamba env update --name $DEFAULT_ENV_NAME -f /tmp/environment-tf.yaml && \
    mamba env update --name $DEFAULT_ENV_NAME -f /tmp/additional-env.yaml && \
    mamba env update --name ${DEFAULT_ENV_NAME}-numba -f /tmp/environment-numba.yaml && \
    mamba env update --name ${DEFAULT_ENV_NAME}-numba -f /tmp/additional-env.yaml && \
    pip install --no-cache-dir nbgitpuller && \
    mamba clean --all -f -y && \
    jupyter lab clean -y && \
    npm cache clean --force && \
    rm -rf /home/$NB_USER/.cache/yarn && \
    rm -rf /home/$NB_USER/.node-gyp && \
    fix-permissions $CONDA_DIR && \
    fix-permissions /home/$NB_USER

COPY update_material.ipynb Acknowledgements.ipynb default_kernel.py /

USER $NB_UID
11 changes: 11 additions & 0 deletions jupyter-images/uah24s/additional-env.yaml
@@ -0,0 +1,11 @@
name: additional-env
channels:
  - conda-forge
dependencies:
  # User requested packages
  - seaborn
  - pip:
      # Install packages with pip only as a last resort, i.e. when they
      # are not available from the conda channels
      - optuna
      - jupyterlab-optuna
52 changes: 52 additions & 0 deletions jupyter-images/uah24s/additional_kernels.ipynb
@@ -0,0 +1,52 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "a9d9cf3f-590d-40ef-8421-a9789a03bb07",
"metadata": {},
"source": [
"### Creating Additional Kernels\n",
"\n",
"You can also create additional kernels and make them available from the kernel menu. The environment must include the `nb_conda_kernels` and `ipykernel` packages for this to work. For example, if you wish to have a kernel with the `seaborn` package, you can create the following `environment.yml` from the terminal with the `pico` editor:\n",
"\n",
"```yaml\n",
" name: myenv\n",
" channels:\n",
" - conda-forge\n",
" dependencies:\n",
" - python=3\n",
" - seaborn\n",
" - nb_conda_kernels\n",
" - ipykernel\n",
"```\n",
"\n",
"followed by\n",
"\n",
"`conda env update --name myenv -f environment.yml`\n",
"\n",
"After the update completes, `myenv` will appear in the `Kernel → Change kernel...` menu."
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.6"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
28 changes: 28 additions & 0 deletions jupyter-images/uah24s/build.sh
@@ -0,0 +1,28 @@
#!/bin/bash

# Check if an image name is provided
if [ -z "$1" ]; then
    echo "Error: No image name provided."
    echo "Usage: $0 <image-name>"
    exit 1
fi

IMAGE_NAME=$1

DATE_TAG=$(date "+%Y%b%d_%H%M%S")
RANDOM_HEX=$(openssl rand -hex 2)
TAG="${DATE_TAG}_${RANDOM_HEX}"

FULL_TAG="unidata/$IMAGE_NAME:$TAG"

echo "Building Docker image with tag: $FULL_TAG"

docker build --no-cache --pull --tag "$FULL_TAG" .

# Check if the build was successful
if [ $? -eq 0 ]; then
    echo "Docker image built successfully: $FULL_TAG"
else
    echo "Error: Docker build failed."
    exit 1
fi
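The tag scheme above (a local date stamp plus four random hex characters, joined with underscores) can be exercised on its own. This is a sketch of the same steps outside Docker; `myimage` is a placeholder name, not one used in this PR, and `openssl` is assumed to be on PATH as it is in the script:

```shell
# Reproduce the build.sh tag scheme; "myimage" is a placeholder.
DATE_TAG=$(date "+%Y%b%d_%H%M%S")
RANDOM_HEX=$(openssl rand -hex 2)
TAG="${DATE_TAG}_${RANDOM_HEX}"
FULL_TAG="unidata/myimage:$TAG"
echo "$FULL_TAG"
```

The random suffix keeps two builds started within the same second from colliding on the same tag.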
67 changes: 67 additions & 0 deletions jupyter-images/uah24s/default_kernel.py
@@ -0,0 +1,67 @@
#!/usr/bin/env python

import argparse
import glob
import json
import os
import re


def update_kernelspec_in_notebooks(directory, new_name):
    """
    Updates the kernelspec in all Jupyter Notebook files within the specified
    directory and its subdirectories, while preserving the original file
    formatting.

    Args:
        directory (str): The path to the directory containing .ipynb files.
        new_name (str): The new name to set in the kernelspec.
    """
    for file_path in glob.glob(f'{directory}/**/*.ipynb', recursive=True):
        try:
            with open(file_path, 'r', encoding='utf-8') as file:
                file_contents = file.read()
            notebook = json.loads(file_contents)

            if 'kernelspec' not in notebook.get('metadata', {}):
                print(f"No kernelspec found in {file_path}. Skipping file.")
                continue

            kernelspec = notebook['metadata']['kernelspec']
            kernelspec['display_name'] = f"Python [conda env:{new_name}]"
            kernelspec['name'] = f"conda-env-{new_name}-py"

            # Convert the updated kernelspec dictionary to a JSON-formatted
            # string with indentation
            updated_kernelspec = json.dumps(kernelspec, indent=4)

            # Replace the existing kernelspec section in the original file
            # contents with the updated JSON string. The non-greedy match
            # stops at the first closing brace, which suffices because a
            # kernelspec is a flat dictionary; the rest of the file's
            # formatting is left untouched.
            updated_contents = re.sub(
                r'"kernelspec": \{.*?\}',
                f'"kernelspec": {updated_kernelspec}',
                file_contents, flags=re.DOTALL
            )

            with open(file_path, 'w', encoding='utf-8') as file:
                file.write(updated_contents)

        except Exception as e:
            print(f"Error processing file {file_path}: {e}")


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Update the kernel name in "
                                     "Jupyter Notebook files in a directory "
                                     "tree.")
    parser.add_argument("new_kernel_name", help="New kernel name to set.")
    parser.add_argument("directory_path", nargs='?', default=os.getcwd(),
                        help="Directory containing .ipynb files (default: "
                             "current directory).")

    args = parser.parse_args()

    update_kernelspec_in_notebooks(args.directory_path, args.new_kernel_name)
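As a quick check of the regex-based rewrite above, here is a minimal sketch applying the same substitution to an in-memory notebook string (the sample notebook JSON and the `uah24s` kernel name are chosen for illustration):

```python
import json
import re

# A minimal notebook fragment with a flat kernelspec, as Jupyter writes it.
file_contents = json.dumps({
    "metadata": {
        "kernelspec": {
            "display_name": "Python 3 (ipykernel)",
            "language": "python",
            "name": "python3",
        }
    },
    "nbformat": 4,
    "nbformat_minor": 5,
}, indent=1)

new_name = "uah24s"
updated_kernelspec = json.dumps({
    "display_name": f"Python [conda env:{new_name}]",
    "language": "python",
    "name": f"conda-env-{new_name}-py",
}, indent=4)

# Same substitution as update_kernelspec_in_notebooks: the non-greedy match
# ends at the first closing brace, i.e. the end of the flat kernelspec dict.
updated = re.sub(r'"kernelspec": \{.*?\}',
                 f'"kernelspec": {updated_kernelspec}',
                 file_contents, flags=re.DOTALL)

assert json.loads(updated)["metadata"]["kernelspec"]["name"] == "conda-env-uah24s-py"
```

Editing the raw text rather than round-tripping through `json.dump` is what preserves each notebook's original key order and whitespace outside the kernelspec block.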
33 changes: 33 additions & 0 deletions jupyter-images/uah24s/environment-numba.yaml
@@ -0,0 +1,33 @@
name: uah24s-numba
channels:
  - conda-forge
dependencies:
  # Required by JupyterLab
  - python=3
  - nb_conda_kernels
  - ipykernel
  # User requested packages
  - numpy
  - matplotlib
  - cartopy
  - metpy
  - siphon
  - pandas
  - pip
  - xarray
  - ipywidgets
  - python-awips
  - scikit-learn
  - tobac
  - s3fs
  - arm_pyart
  - netCDF4
  - zarr
  # numba and cuda
  # See https://numba.pydata.org/numba-doc/latest/user/installing.html
  - numba
  - cudatoolkit
  - pip:
      # Install packages with pip only as a last resort, i.e. when they
      # are not available from the conda channels
      - palmerpenguins
42 changes: 42 additions & 0 deletions jupyter-images/uah24s/environment-tf.yaml
@@ -0,0 +1,42 @@
name: uah24s
channels:
  - conda-forge
dependencies:
  # Required by JupyterLab
  - python=3
  - nb_conda_kernels
  - ipykernel
  # User requested packages
  - numpy
  - matplotlib
  - cartopy
  - metpy
  - siphon
  - pandas
  - pip
  - xarray
  - ipywidgets
  - python-awips
  - scikit-learn
  - tobac
  - s3fs
  - arm_pyart
  - netCDF4
  - zarr
  - pip:
      # Install packages with pip only as a last resort, i.e. when they
      # are not available from the conda channels
      - palmerpenguins
      - tensorflow==2.15.post1
      - nvidia-cublas-cu12==12.2.5.6
      - nvidia-cuda-cupti-cu12==12.2.142
      - nvidia-cuda-nvcc-cu12==12.2.140
      - nvidia-cuda-nvrtc-cu12==12.2.140
      - nvidia-cuda-runtime-cu12==12.2.140
      - nvidia-cudnn-cu12==8.9.4.25
      - nvidia-cufft-cu12==11.0.8.103
      - nvidia-curand-cu12==10.3.3.141
      - nvidia-cusolver-cu12==11.5.2.141
      - nvidia-cusparse-cu12==12.1.2.141
      - nvidia-nccl-cu12==2.16.5
      - nvidia-nvjitlink-cu12==12.2.140
66 changes: 66 additions & 0 deletions jupyter-images/uah24s/gpu/Dockerfile
@@ -0,0 +1,66 @@
# Heavily borrowed from docker-stacks
# https://github.com/jupyter/docker-stacks/blob/main/docker-stacks-foundation/Dockerfile

ARG BASE_CONTAINER=nvcr.io/nvidia/tensorflow:22.04-tf2-py3
FROM $BASE_CONTAINER

LABEL maintainer="Unidata <[email protected]>"

# Fix: https://github.com/hadolint/hadolint/wiki/DL4006
# Fix: https://github.com/koalaman/shellcheck/wiki/SC3014
SHELL ["/bin/bash", "-o", "pipefail", "-c"]

ENV DEBIAN_FRONTEND=noninteractive
RUN apt-get update --yes && \
    apt-get upgrade --yes && \
    apt-get install --yes --no-install-recommends \
    bzip2 ca-certificates locales sudo wget software-properties-common \
    libproj-dev proj-data proj-bin libgeos-dev ffmpeg && \
    # Updating Python breaks TensorFlow from the base container, unfortunately.
    # add-apt-repository ppa:deadsnakes/ppa && apt-get update --yes && \
    # apt-get install -y python3.10 python3.10-distutils && \
    # curl -sS https://bootstrap.pypa.io/get-pip.py | python3.10 && \
    # ln -sfn /usr/bin/python3.10 /usr/bin/python3 && \
    # ln -sfn /usr/bin/python3 /usr/bin/python && \
    # ln -sfn /usr/bin/pip3 /usr/bin/pip && \
    # Version specifiers and brackets must be quoted so the shell does not
    # treat ">=" as a redirection or "[...]" as a glob.
    python3 -m pip install --no-cache-dir jupyterhub==3.0.0 'jupyterlab>=3' \
    notebook jupyter_server cartopy catboost metpy minisom netCDF4 pillow \
    'pyvista[all,trame]' pyvista-xarray seaborn shapely torch torchaudio \
    torchvision verde xarray ipywidgets jupyterlab_widgets \
    jupyter-server-proxy --upgrade && \
    python3 -m pip uninstall vtk -y && \
    python3 -m pip install --no-cache-dir --upgrade --extra-index-url \
    https://wheels.vtk.org vtk-osmesa --extra-index-url \
    https://download.pytorch.org/whl/cu112 && \
    apt-get clean && rm -rf /var/lib/apt/lists/* && \
    echo "en_US.UTF-8 UTF-8" > /etc/locale.gen && \
    locale-gen

ARG NB_USER="jovyan"
ARG NB_UID="1000"
ARG NB_GID="100"

COPY fix-permissions /usr/local/bin/fix-permissions
RUN chmod a+rx /usr/local/bin/fix-permissions

ENV HOME="/home/${NB_USER}"

# Enable prompt color in the skeleton .bashrc before creating the default NB_USER
# hadolint ignore=SC2016
RUN sed -i 's/^#force_color_prompt=yes/force_color_prompt=yes/' /etc/skel/.bashrc

# Create NB_USER (jovyan by default) with UID=1000 in the 'users' group
# and make sure these dirs are writable by the `users` group.
RUN echo "auth requisite pam_deny.so" >> /etc/pam.d/su && \
    useradd -l -m -s /bin/bash -N -u "${NB_UID}" "${NB_USER}" && \
    chmod g+w /etc/passwd && \
    fix-permissions "${HOME}"

COPY Acknowledgements.ipynb /
COPY gpu.ipynb /
COPY weatherbench_TF.ipynb /
COPY MNIST_Example_PyTorch.ipynb /

USER ${NB_UID}

WORKDIR "${HOME}"