Fit24f #926

Open · wants to merge 3 commits into base: master
2 changes: 2 additions & 0 deletions jupyter-images/fit24f/.condarc
@@ -0,0 +1,2 @@
envs_dirs:
- /home/jovyan/additional-envs
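The `envs_dirs` entry points conda/mamba at the user's home volume, so environments created there persist across server restarts. A quick way to confirm the configured location from a terminal (a sketch, assuming the image's standard conda setup):

```bash
# Show the configured environment directories; /home/jovyan/additional-envs
# should appear in the list, so new named environments land on the home volume.
conda config --show envs_dirs
```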
43 changes: 43 additions & 0 deletions jupyter-images/fit24f/Acknowledgements.ipynb
@@ -0,0 +1,43 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "c86cd54f-b73c-4781-b6eb-89c79d3d3b22",
"metadata": {},
"source": [
"## Acknowledgements\n",
"\n",
"Launching this JupyterHub server is the result of a collaboration between several research and academic institutions and their staff. For Jetstream2 and JupyterHub expertise, we thank Andrea Zonca (San Diego Supercomputing Center), Jeremy Fischer, Mike Lowe (Indiana University), the NSF Jetstream2 (`doi:10.1145/3437359.3465565`) team.\n",
"\n",
"This work employs the NSF Jetstream2 Cloud at Indiana University through allocation EES220002 from the Advanced Cyberinfrastructure Coordination Ecosystem: Services & Support (ACCESS) program, which is supported by National Science Foundation grants #2138259, #2138286, #2138307, #2137603, and #2138296.\n",
"\n",
"Unidata is one of the University Corporation for Atmospheric Research (UCAR)'s Community Programs (UCP), and is funded primarily by the National Science Foundation (AGS-1901712).\n",
"\n",
"## To Acknowledge This JupyterHub and the Unidata Science Gateway\n",
"\n",
"If you have benefited from the Unidata Science Gateway, please cite `doi:10.5065/688s-2w73`. Additional citation information can be found in this [Citation File Format file](https://raw.githubusercontent.com/Unidata/science-gateway/master/CITATION.cff).\n"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.6"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
38 changes: 38 additions & 0 deletions jupyter-images/fit24f/Dockerfile
@@ -0,0 +1,38 @@
# Heavily borrowed from docker-stacks/minimal-notebook/
# https://github.com/jupyter/docker-stacks/blob/main/minimal-notebook/Dockerfile

ARG BASE_CONTAINER=jupyter/minimal-notebook
FROM $BASE_CONTAINER

ENV DEFAULT_ENV_NAME=fit24f

LABEL maintainer="Unidata <[email protected]>"

USER root

RUN apt-get update && \
apt-get install -y --no-install-recommends vim curl && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*

USER $NB_UID

ADD environment.yml /tmp

RUN mamba install --quiet --yes \
'conda-forge::nb_conda_kernels' \
'conda-forge::jupyterlab-git' \
'conda-forge::ipywidgets' && \
mamba env update --name $DEFAULT_ENV_NAME -f /tmp/environment.yml && \
pip install --no-cache-dir nbgitpuller && \
mamba clean --all -f -y && \
jupyter lab clean -y && \
npm cache clean --force && \
rm -rf /home/$NB_USER/.cache/yarn && \
rm -rf /home/$NB_USER/.node-gyp && \
fix-permissions $CONDA_DIR && \
fix-permissions /home/$NB_USER

COPY update_material.ipynb Acknowledgements.ipynb default_kernel.py .condarc /

USER $NB_UID
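For a local sanity check of the image, something like the following could be used; the tag is illustrative and not part of this PR:

```bash
# Build the image from the jupyter-images/fit24f directory (example tag)
docker build --pull --tag unidata/fit24f:local-test .

# The jupyter/minimal-notebook base image starts the Jupyter server by default
docker run --rm -p 8888:8888 unidata/fit24f:local-test
```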
52 changes: 52 additions & 0 deletions jupyter-images/fit24f/additional_kernels.ipynb
@@ -0,0 +1,52 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "a9d9cf3f-590d-40ef-8421-a9789a03bb07",
"metadata": {},
"source": [
"### Creating Additional Kernels\n",
"\n",
"You can also create additional kernels and have them be available via the kernel menu. Your kernel must contain the `nb_conda_kernels` and `ipykernel` packages for this to work. For example, if you wish to have a kernel with the `seaborn` package, you can create the following `environment.yml` from the terminal with the `pico` editor:\n",
"\n",
"```yaml\n",
" name: myenv\n",
" channels:\n",
" - conda-forge\n",
" dependencies:\n",
" - python=3\n",
" - seaborn\n",
" - nb_conda_kernels\n",
" - ipykernel\n",
"```\n",
"\n",
"followed by\n",
"\n",
"`mamba env update --name myenv -f environment.yml`\n",
"\n",
"at this point `myenv` will be available via the `Kernel → Change kernel...` menu."
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.6"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
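A quick, hypothetical check that the environment described above built correctly and contains the required packages:

```bash
# List environments; with the .condarc above, myenv should live under
# /home/jovyan/additional-envs
mamba env list

# Verify the packages needed for kernel discovery are importable
mamba run -n myenv python -c "import seaborn, ipykernel; print('ok')"
```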
18 changes: 18 additions & 0 deletions jupyter-images/fit24f/allowed_users.yaml
@@ -0,0 +1,18 @@
---
# When there are a lot of users we keep secrets.yaml short by using a separate
# file (this file) for the allow list. To ensure this file is read by
# JupyterHub, add the `--values allowed_users.yaml` option to the `helm upgrade`
# command in `install_jhub.sh`.
#
# IMPORTANT: As you'll likely copy and paste usernames from a spreadsheet into
# this file, ensure that all usernames consist of only lower-case characters. See:
# https://discourse.jupyter.org/t/spawner-unnecessarily-encoding-capital-letters-leading-to-pvc-creation-errors-and-jhub-crash/17704
#
# You can lower-case a block in vim by selecting the relevant lines in Visual
# Line mode (`V`) and then using the `gu` command.

hub:
config:
Authenticator:
allowed_users:
- users
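A sketch of the `helm upgrade` invocation referenced in the comment above; the release name, namespace, and chart reference are placeholders, and the actual command in `install_jhub.sh` may differ:

```bash
# Pass both values files so the allow list in this file is applied
helm upgrade --cleanup-on-fail --install jhub jupyterhub/jupyterhub \
  --namespace jhub \
  --values secrets.yaml \
  --values allowed_users.yaml
```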
28 changes: 28 additions & 0 deletions jupyter-images/fit24f/build.sh
@@ -0,0 +1,28 @@
#!/bin/bash

# Check if an image name is provided
if [ -z "$1" ]; then
echo "Error: No image name provided."
echo "Usage: $0 <image-name>"
exit 1
fi

IMAGE_NAME=$1

DATE_TAG=$(date "+%Y%b%d_%H%M%S")
RANDOM_HEX=$(openssl rand -hex 2)
TAG="${DATE_TAG}_${RANDOM_HEX}"

FULL_TAG="unidata/$IMAGE_NAME:$TAG"

echo "Building Docker image with tag: $FULL_TAG"

docker build --no-cache --pull --tag "$FULL_TAG" .

# Check if the build was successful
if [ $? -eq 0 ]; then
echo "Docker image built successfully: $FULL_TAG"
else
echo "Error: Docker build failed."
exit 1
fi
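Example usage of the script; the generated tag below is a placeholder that follows the date + random-hex scheme above:

```bash
./build.sh fit24f
# Building Docker image with tag: unidata/fit24f:2024Sep03_101500_1a2b
# Docker image built successfully: unidata/fit24f:2024Sep03_101500_1a2b

# Push the image so the JupyterHub chart can pull it (placeholder tag)
docker push unidata/fit24f:2024Sep03_101500_1a2b
```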
67 changes: 67 additions & 0 deletions jupyter-images/fit24f/default_kernel.py
@@ -0,0 +1,67 @@
#!/usr/bin/env python

import argparse
import glob
import json
import os
import re


def update_kernelspec_in_notebooks(directory, new_name):
"""
Updates the kernelspec in all Jupyter Notebook files within the specified
directory and its subdirectories, while preserving the original file
formatting.

Args:
directory (str): The path to the directory containing .ipynb files.
new_name (str): The new name to set in the kernelspec.
"""
for file_path in glob.glob(f'{directory}/**/*.ipynb', recursive=True):
try:
with open(file_path, 'r', encoding='utf-8') as file:
file_contents = file.read()
notebook = json.loads(file_contents)

if 'kernelspec' not in notebook.get('metadata', {}):
print(f"No kernelspec found in {file_path}. Skipping file.")
continue

kernelspec = notebook['metadata']['kernelspec']
kernelspec['display_name'] = f"Python [conda env:{new_name}]"
kernelspec['name'] = f"conda-env-{new_name}-py"

# Convert the updated kernelspec dictionary to a JSON-formatted
# string with indentation
updated_kernelspec = json.dumps(kernelspec, indent=4)

# Replace the existing kernelspec section in the original file
# contents with the updated JSON string. The non-greedy regular
# expression matches from the "kernelspec" key to the first closing
# brace, which covers the flat kernelspec object while preserving
# the rest of the file's formatting.
updated_contents = re.sub(
r'"kernelspec": \{.*?\}',
f'"kernelspec": {updated_kernelspec}',
file_contents, flags=re.DOTALL
)

with open(file_path, 'w', encoding='utf-8') as file:
file.write(updated_contents)

except Exception as e:
print(f"Error processing file {file_path}: {e}")


if __name__ == "__main__":
parser = argparse.ArgumentParser(description="Update the kernel name in "
"Jupyter Notebook files in directory "
"tree.")
parser.add_argument("new_kernel_name", help="New kernel name to set.")
parser.add_argument("directory_path", nargs='?', default=os.getcwd(),
help="Directory containing .ipynb files (default: "
"current directory).")

args = parser.parse_args()

update_kernelspec_in_notebooks(args.directory_path, args.new_kernel_name)
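Example invocations, mirroring the postStart hook in `secrets.yaml`:

```bash
# Point every notebook under /home/jovyan at the fit24f conda kernel
python /default_kernel.py fit24f /home/jovyan

# Or rely on the default directory argument (current working directory)
python default_kernel.py fit24f
```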
24 changes: 24 additions & 0 deletions jupyter-images/fit24f/environment.yml
@@ -0,0 +1,24 @@
name: fit24f
channels:
- conda-forge
dependencies:
# Required by JupyterLab
- python=3
- nb_conda_kernels
- ipykernel
# User requested packages
- tensorflow
- statsmodels
- seaborn
- scipy
- arm_pyart
- netcdf4
- keras
- gdal
- geotiff
- geoviews
- geopandas
- pip:
# It is recommended to install a package using pip as a last resort, i.e.
# when it is not found in the conda repos
- palmerpenguins
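To build or update this environment outside the image, a sketch mirroring the `mamba env update` step in the Dockerfile:

```bash
mamba env update --name fit24f -f environment.yml

# Illustrative import check for a few of the requested packages
# (arm_pyart is imported as pyart)
mamba run -n fit24f python -c "import pyart, geopandas, seaborn"
```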
110 changes: 110 additions & 0 deletions jupyter-images/fit24f/secrets.yaml
@@ -0,0 +1,110 @@
hub:
cookieSecret: "xxx"
config:
Authenticator:
admin_users:
- admins
#If you have a large list of users, consider using allowed_users.yaml
allowed_users:
- users
GitHubOAuthenticator:
client_id: "xxx"
client_secret: "xxx"
oauth_callback_url: "https://fit24f-1.ees220002.projects.jetstream-cloud.org:443/oauth_callback"
JupyterHub:
authenticator_class: github
extraConfig:
01-no-labels: |
from kubespawner import KubeSpawner
class CustomSpawner(KubeSpawner):
def _build_common_labels(self, extra_labels):
labels = super()._build_common_labels(extra_labels)
# Until https://github.com/jupyterhub/kubespawner/issues/498
# is fixed
del labels['hub.jupyter.org/username']
return labels
c.JupyterHub.spawner_class = CustomSpawner


proxy:
secretToken: "xxx"

ingress:
enabled: true
annotations:
kubernetes.io/ingress.class: "nginx"
cert-manager.io/cluster-issuer: "letsencrypt"
#For manually issuing certificates: see vms/jupyter/readme.md
#cert-manager.io/issuer: "incommon"
nginx.ingress.kubernetes.io/proxy-body-size: 500m
hosts:
- "fit24f-1.ees220002.projects.jetstream-cloud.org"
tls:
- hosts:
- "fit24f-1.ees220002.projects.jetstream-cloud.org"
secretName: certmanager-tls-jupyterhub

#For having a dedicated core node: see vms/jupyter/readme.md
#scheduling:
# corePods:
# tolerations:
# - key: hub.jupyter.org/dedicated
# operator: Equal
# value: core
# effect: NoSchedule
# - key: hub.jupyter.org_dedicated
# operator: Equal
# value: core
# effect: NoSchedule
# nodeAffinity:
# matchNodePurpose: require

singleuser:
extraEnv:
NBGITPULLER_DEPTH: "0"
storage:
capacity: 10Gi
startTimeout: 600
memory:
guarantee: 4G
limit: 4G
cpu:
guarantee: 1
limit: 1
defaultUrl: "/lab"
image:
name: "unidata/fit24f"
tag: "xxx"
lifecycleHooks:
postStart:
exec:
command:
- "bash"
- "-c"
- >
dir="/home/jovyan/.ssh"; [ -d $dir ] && { chmod 700 $dir && \
chmod -f 600 $dir/* && chmod -f 644 $dir/*.pub; } || true;
cp -t /home/jovyan /Acknowledgements.ipynb /update_material.ipynb;
python /default_kernel.py $DEFAULT_ENV_NAME /home/jovyan;
[[ -f $HOME/.bashrc ]] || cp /etc/skel/.bashrc $HOME;
[[ -f $HOME/.profile ]] || cp /etc/skel/.profile $HOME;
[[ -f $HOME/.bash_logout ]] || cp /etc/skel/.bash_logout $HOME;
[[ -f $HOME/.condarc ]] || cp /.condarc $HOME;
gitpuller https://github.com/slazmo/MET3601 main MET3601 || true;
#Multiple profiles: see vms/jupyter/readme.md
#profileList:
#- display_name: "High Power (default)"
# description: "12 GB of memory; up to 4 vCPUs"
# kubespawner_override:
# mem_guarantee: 12G
# mem_limit: 12G
# cpu_guarantee: 2
# cpu_limit: 4
# default: true
#- display_name: "Low Power"
# description: "6 GB of memory; up to 2 vCPUS"
# kubespawner_override:
# mem_guarantee: 6G
# mem_limit: 6G
# cpu_guarantee: 1
# cpu_limit: 2
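The `xxx` placeholders for `hub.cookieSecret`, `proxy.secretToken`, and the image tag are intentionally left unfilled here; the two secrets are typically generated as random hex strings, e.g.:

```bash
# Generate values for hub.cookieSecret and proxy.secretToken
openssl rand -hex 32
```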