diff --git a/README.md b/README.md index 398b565c..a2bfe5d7 100644 --- a/README.md +++ b/README.md @@ -1,6 +1,6 @@ # MAAP Jupyter Interface -This repo includes JupyterLab extensions that have been built for the MAAP project (https://che-k8s.maap.xyz/) +This repo includes JupyterLab extensions that have been built for the MAAP project (https://www.maap-project.org/) In order to use each of these extensions they must be installed and enabled in your environment. Instructions for each extension can be found in the respective folder. Make sure each extension's dependencies are installed first. @@ -16,8 +16,8 @@ Create a new Conda environment and install common dependencies and versions: conda create --name maap-ade conda activate maap-ade conda install conda=4.7.12 jupyterlab=2.1.4 nodejs=10.13.0 gitpython=3.0.2 -jupyter labextension install jupyterlab_toastify@3.3.0 --no-build -npm i jupyterlab_toastify@3.3.0 +jupyter labextension install jupyterlab_toastify@3.0.0 +npm i jupyterlab_toastify@3.0.0 pip install plotly==4.0.0 jupyter labextension install @jupyter-widgets/jupyterlab-manager@2.0 ``` @@ -68,143 +68,35 @@ Some Jupyter Extensions/Resources we have found helpful: In JupyterLab's update to the stable 1.0 version, they have also updated and added lots of documentation on extension development. I recommend taking a look at [this](https://jupyterlab.readthedocs.io/en/stable/developer/extension_dev.html). -## Deploying Extensions as Part of Eclipse Che -### Dockerizing -Our development process involves building and running an extension locally in jupyterlab using a conda env before -installing it on the che server. To enable an extension in Che, it must be included in the base docker image/stack that a -Che workspace is launched with. The dockerfile that extensions are included in is the `Dockerfile` and the highest level -in this repo. At the point of adding your extension into the Docker image, some minor changes may have to be made -(mainly path issues). This will be explained in the bullets below. - -An instance of this repository lives on the Che server under `~/che/dockerfiles/maap-jupyter-ide`. Once an extension has been tested locally, rebuild the docker -image with your new extensions. - - -- Add your install to the Dockerfile. For example: - ```bash - # jlab pull projects into /projects directory - COPY pull_projects /pull_projects - RUN cd /pull_projects && npm run build - RUN cd /pull_projects && jupyter labextension link . - RUN cd /pull_projects && pip install -e . - RUN cd /pull_projects && jupyter serverextension enable --py pull_projects --sys-prefix - - ``` -- If your extension includes a server extension you also need to modify `entrypoint.sh`. This is because jupyter -server extensions function off of having a standard base url, but in the context of che the url is not what jupyter -thinks it is. - - Here is some magic that fixes it (add this line and replace with the path to where -your `load_jupyter_server_extension` function is) +## Steps for Contributing +### Each Contribution +- Clone repository locally to test and develop extensions. Create your own branch and once you have tested it and are ready to contribute, put a PR into the develop branch. +- After you have validated your code is working locally, the next step is to test it in the MAAP ADE in the devlopment environment. +- When there is an update to the develop branch, such as merging in a PR, the CI will be kicked off automatically and will build the image necessary for use in the MAAP ADE. 
This image will end in the `:develop` tag. This is intended to be used to test in the development environment. +- When updates in develop have been tested in the MAAP ADE and deemed stable, merge them into master. +### Ready for Release +- Once there have been changes that warrant a new release to the operations environment, create a new GitHub release from the master branch. Name the release with the next version number following the `v3.0.1` naming scheme. In the notes of the release, include a changelog of the updates included in that release. +- Then, update the existing `stable` tag to point to the same place as the release that you just created. This can easily be done on the command line by checking out the latest code on master (or whatever commit the release was created from) and running the following: + ``` + git tag -f stable + git push -f origin stable + ``` +- The CI will also be kicked off for `stable` and `v` tags. The `stable` tag is used in the devfiles in the operations environment and automatically pulled into each new workspace, so it is very important that changes pushed to this tag have been thoroughly tested. The `v` tag is not used in any environment but is used to track versions and the changes we roll out. If there is an issue found in the `stable` tag we can roll it back to the previous `v` release by checking out that release and running the lines above. +- The CI trigger information can be found in .github/workflows/gitlab-trigger.yml and in the Actions tab. + +## Docker Images and CI Process +- The CI and images built off of this code live on MAAP's hosted GitLab (https://mas.maap-project.org/root). +- Triggered from maap-jupyter-ide + - Will build the corresponding maap-jupyter-ide branch on all of the base images listed in the base_images.txt file +- Triggered from GitLab + - If triggered from a [base image](https://mas.maap-project.org/root/ade-base-images) being updated, it will only rebuild that base image with maap-jupyter-ide's stable tag on top of it. + - If triggered from an update in the [jupyter-image](https://mas.maap-project.org/root/jupyter-image) repo, it will rebuild all of the base images listed in base_images.txt with maap-jupyter-ide's stable tag. +- Troubleshooting + - If the image failed to build for an unexpected reason, it is likely because the GitLab runner ran out of space. SSH onto the runner, clean up the Docker images (you can get rid of anything - everything important is on S3-backed registries) and restart the failed job. + + +## Troubleshooting +- If your extension includes a server extension you also need to modify `entrypoint.sh`. This is because Jupyter server extensions rely on having a standard base URL, but in the context of Che the URL is not what Jupyter thinks it is. + - Here is some magic that fixes it (add this line and replace with the path to where your `load_jupyter_server_extension` function is; a sketch of its effect follows below) ```bash perl -pi -e "s|web_app.settings\['base_url'\]|'/'|g" /show_ssh_info/show_ssh_info/__init__.py - ``` -- Then rebuild the docker image. `microk8s.docker build -t localhost:32000/che-jupyter-lab-ide .` -- Push! `microk8s.docker push localhost:32000/che-jupyter-lab-ide ` -- Now when you build a new workspace with the `localhost:32000/che-jupyter-lab-ide` image it will automatically -fetch the new image.
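To make the troubleshooting substitution above concrete, here is a minimal sketch, modeled on `show_ssh_info/show_ssh_info/__init__.py` from this repo, of the line the perl one-liner rewrites. The handler import and route are illustrative; the only thing the one-liner actually changes is the `web_app.settings['base_url']` expression.

```python
# Sketch only: a typical load_jupyter_server_extension before/after the perl fix.
from notebook.utils import url_path_join
from .handlers import GetHandler  # illustrative handler, as in show_ssh_info

def load_jupyter_server_extension(nb_server_app):
    web_app = nb_server_app.web_app
    host_pattern = '.*$'

    # Before the fix: handlers are registered under Jupyter's configured base_url,
    # which in Che includes the per-workspace path prefix that the proxy strips
    # off before requests ever reach Tornado.
    base_url = web_app.settings['base_url']
    # After the fix, the perl one-liner has rewritten the line above to:
    #     base_url = '/'
    # so the route below becomes /show_ssh_info/get and matches what the proxy forwards.

    web_app.add_handlers(host_pattern, [
        (url_path_join(base_url, 'show_ssh_info/get'), GetHandler),
    ])
```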
(found in the stack's `Recipe` or `Raw Configuration`) - - you can also specify the image tag to use in your build on the stack if you want to use a previous build -- Any change pushed to `microk8s.docker push localhost:32000/che-jupyter-lab-ide ` will affect the default stacks -on all user accounts. If you are testing something, you can create your own image and your own stack to play around with. - - -### Che Stacks -To make your custom docker image available to users in Che, you need to make a new stack that creates workspaces using your image and make it available to users. -Below is an example stack configuration using our locally built dockerized juptyer image with MAAP extensions installed. - -Make sure to replace the image name in `workspaceConfig.environments.default.recipe.image` with the location of your image. -In order for SSH-ing into the workspace to be possible, the `org.eclipse.che.exec` and `org.eclipse.che.ssh` installers must be enabled under `workspaceConfig.environments.default.machines.ws/jupyter.installers`. - -#### Che Stack Raw Configuration -``` -{ - "scope": "general", - "description": "Use this one. Stable jupyter. No extra packages", - "creator": "b07e3a58-ed50-4a6e-be17-fcf49ff8b242", - "tags": [ - "MAAP", - "JUPYTER", - "STABLE" - ], - "workspaceConfig": { - "defaultEnv": "default", - "environments": { - "default": { - "recipe": { - "contentType": "text/x-yaml", - "type": "kubernetes", - "content": "kind: List\nitems:\n - \n apiVersion: v1\n kind: Pod\n metadata:\n name: ws\n labels:\n name: ws\n spec:\n containers:\n - \n name: jupyter\n image: 'localhost:32000/stable-ide:latest'\n resources:\n limits:\n memory: 1024Mi\n securityContext:\n privileged: true\n - \n apiVersion: v1\n kind: Service\n metadata:\n name: ws\n spec:\n type: NodePort\n ports:\n - \n port: 22\n selector:\n name: ws\n \n " - }, - "machines": { - "ws/jupyter": { - "env": { - "MACHINE_NAME": "WS_JUPYTER" - }, - "servers": { - "jupyter": { - "path": "/", - "attributes": { - "cookiesAuthEnabled": "true", - "type": "ide", - "secure": "true" - }, - "protocol": "http", - "port": "3100" - } - }, - "installers": [ - "org.eclipse.che.exec", - "org.eclipse.che.ssh" - ], - "volumes": { - "projects": { - "path": "/projects" - } - }, - "attributes": {} - } - } - } - }, - "projects": [], - "commands": [], - "name": "default", - "attributes": {}, - "links": [] - }, - "components": [], - "name": "maap-jupyter-ide", - "id": "stacktdo2q0ixhv7cge00" -} -``` -#### Enabling Privileged Docker Containers -1. Cluster Privileges -- in `/var/snap/microk8s/current/args/kubelet` and `/var/snap/microk8s/current/args/kube-apiserver`, append `--allow-privileged` -- restart both services: -``` -sudo systemctl restart snap.microk8s.daemon-apiserver -sudo systemctl restart snap.microk8s.daemon-kubelet -``` - -2. Che Permissions -- in `che/dockerfiles/init/manifest/che.env`, set `CHE_DOCKER_PRIVILEGED=true` under the Privileged Mode section -- restart Che - -#### Creating and Sharing Stacks -To create a stack, you write a raw configuration with all the che and docker settings your workspace will require, including installers, volumes, docker run tags, docker images, etc. See the example above. - -To share a stack, you will need to be the owner (creator) of the stack. -Go to the homepage of where your Che instance is hosted and add `/swagger` to the end of the url for an interface with Che's API. 
Under the `permissions`section, make a POST request with the users you want to share with and the id of your stack (shows up at the bottom of the configuration after creation). -POST body: -``` -{ -"userId": "*", - "domainId": "stack", - "instanceId": "${STACK_ID}", - "actions": [ - "read", - "search" - ] -} -``` -reference: https://www.eclipse.org/che/docs/che-6/stacks.html#sharing-stacks-and-system-stacks -. diff --git a/dps_info/src/jobinfo.ts b/dps_info/src/jobinfo.ts index 87704676..5bcf152c 100644 --- a/dps_info/src/jobinfo.ts +++ b/dps_info/src/jobinfo.ts @@ -141,14 +141,12 @@ export class JobTable extends Widget { } async _getJobList(me:JobTable) { - // console.log(this._username); const res:RequestResult = await getJobs(me._state, me._username); if(res.ok){ let json_response:any = res.json(); - INotification.success("Get user jobs success."); - // console.log(json_response); if (json_response['status_code'] === 200){ - // let resp = json_response['result']; + INotification.success("Get user jobs success."); + me._table = json_response['result']; JOBS = json_response['jobs']; DISPLAYS = json_response['displays']; @@ -1130,8 +1128,9 @@ export class JobWidget extends Widget { const res:RequestResult = await getJobs(this._state, this._username); if(res.ok){ let json_response:any = res.json(); - INotification.success("Get user jobs success."); if (json_response['status_code'] === 200){ + INotification.success("Get user jobs success."); + me._table = json_response['table']; JOBS = json_response['jobs']; DISPLAYS = json_response['displays']; diff --git a/edsc_extension/edsc_extension/handlers.py b/edsc_extension/edsc_extension/handlers.py index e0e6c306..e0a3dc8a 100644 --- a/edsc_extension/edsc_extension/handlers.py +++ b/edsc_extension/edsc_extension/handlers.py @@ -3,9 +3,25 @@ import nbformat from notebook.base.handlers import IPythonHandler import subprocess +import functools +import json import maap from maap.maap import MAAP +@functools.lru_cache(maxsize=128) +def get_maap_config(host): + path_to_json = os.path.join(os.path.dirname(os.path.abspath(__file__)), '../..', 'maap_environments.json') + + with open(path_to_json) as f: + data = json.load(f) + + match = next((x for x in data if host in x['ade_server']), None) + maap_config = next((x for x in data if x['default_host'] == True), None) if match is None else match + + return maap_config + +def maap_api(host): + return get_maap_config(host)['api_server'] class GetGranulesHandler(IPythonHandler): def printUrls(self, granules): @@ -18,7 +34,7 @@ def printUrls(self, granules): def get(self): - maap = MAAP() + maap = MAAP(maap_api(self.request.host)) cmr_query = self.get_argument('cmr_query', '') limit = str(self.get_argument('limit', '')) print("cmr_query", cmr_query) @@ -35,7 +51,7 @@ def get(self): class GetQueryHandler(IPythonHandler): def get(self): - maap = MAAP() + maap = MAAP(maap_api(self.request.host)) cmr_query = self.get_argument('cmr_query', '') limit = str(self.get_argument('limit', '')) query_type = self.get_argument('query_type', 'granule') diff --git a/entrypoint.sh b/entrypoint.sh index d2bed004..2866cb32 100755 --- a/entrypoint.sh +++ b/entrypoint.sh @@ -1,29 +1,95 @@ -#!/bin/bash - -# Reconstruct Che preview url -export PREVIEW_URL=`/usr/bin/printenv | grep $MACHINE_NAME | perl -slane 'if(/^(SERVER[^_]+_$mn)_SERVICE_PORT=(\d+)/) { $p=$2; print "/".lc($1=~s/_/-/rg)."/server-".$p; }' -- -mn=$MACHINE_NAME | uniq` +#!/bin/bash -# Find the jupyter install in conda (because jlab may be installed in python3.6 or 
python3.7) +# A more robust method of constructing the workspace url prefix +get_workspace_url_prefix() { + python - <-che" + if test -z "$PREVIEW_URL" + then + PREVIEW_URL=$(get_workspace_url_prefix "$CHE_WORKSPACE_NAMESPACE-$CHE_WORKSPACE_ID") # Che 7 UAT configuration fallback where the default namespace is "-", this will be deprecated + fi + if test -z "$PREVIEW_URL" + then + PREVIEW_URL=$(get_workspace_url_prefix "$CHE_WORKSPACE_ID") # Che 6 configuration fallback where the default namespace is the workspace id, this will be deprecated + fi + if ! test -z "$PREVIEW_URL" # exit loop when it has a preview_url + then + break + fi +done + +# Fix Jupyterlab for Che in `single-host` mode. In `single-host` mode, Che uses URL path prefixes +# to distinguish workspaces. So for example, `https:///work/space/path/`. +# Because of this, we need to set Jupyter's `base_url` to `/work/space/path` so that all hrefs +# and links to files have the correct path prefix. HOWEVER, when the browser accesses those paths, +# the ingress/nginx proxy strips out the `base_url`! Even if the browser makes a request to `/work/space/path/lab`, +# Jupyter's web server, Tornado, only see a request for `/lab`. Tornado routes calls to the correct handler by +# matching the path against a known regular expression. Because `base_url` is configured to `/work/space/path`, +# Tornado only handles requests that have that path prefix, causing calls such as `/lab` to fail. The fix below +# allows `base_url` to be set so that HTTP *responses* include the `base_url` so that browsers make the correct +# call. However, it strips out `base_url` when listening for *requests* so that handles the ingress/nginx proxy +# requests correctly. export NOTEBOOKLIBPATH=`find /opt/conda/lib/ -maxdepth 3 -type d -name "notebook"` -export JUPYTERLABLIBPATH=`find /opt/conda/lib/ -maxdepth 3 -type d -name "jupyterlab"` -export JUPYTERLABSERVERLIBPATH=`find /opt/conda/lib/ -maxdepth 3 -type d -name "jupyterlab_server"` - -# Fixes to JupyterLab with base_url and Che proxying. Hack. 
-perl -pi -e "s|nb_app.web_app.settings\['base_url'\]|'/'|g" $NOTEBOOKLIBPATH/terminal/__init__.py -perl -pi -e "s|webapp.settings\['base_url'\]|'/'|g" $NOTEBOOKLIBPATH/terminal/__init__.py -perl -pi -e "s|web_app.settings\['base_url'\]|'/'|g" $JUPYTERLABLIBPATH/extension.py -perl -pi -e "s|web_app.settings\['base_url'\]|'/'|g" $JUPYTERLABSERVERLIBPATH/handlers.py -perl -pi -e "s|pattern = url_path_join\(settings\['base_url'\], handler\[0\]\)|pattern = url_path_join('/', handler[0])|g" $NOTEBOOKLIBPATH/notebookapp.py -perl -pi -e "s|web_app.settings\['base_url'\]|'/'|g" /maap-jupyter-ide/pull_projects/pull_projects/__init__.py -perl -pi -e "s|web_app.settings\['base_url'\]|'/'|g" /maap-jupyter-ide/show_ssh_info/show_ssh_info/__init__.py -perl -pi -e "s|web_app.settings\['base_url'\]|'/'|g" /maap-jupyter-ide/edsc_extension/edsc_extension/__init__.py -#perl -pi -e "s|web_app.settings\['base_url'\]|'/'|g" /jupyterlab_iframe/jupyterlab_iframe/__init__.py -#perl -pi -e "s|web_app.settings\['base_url'\]|'/'|g" /inject_ssh/inject_ssh/__init__.py -perl -pi -e "s|web_app.settings\[\"base_url\"\]|'/'|g" /maap-jupyter-ide/jupyterlab-git/jupyterlab_git/handlers.py -perl -pi -e "s|web_app.settings\['base_url'\]|'/'|g" /maap-jupyter-ide/submit_jobs/submit_jobs/__init__.py -perl -pi -e "s|web_app.settings\['base_url'\]|'/'|g" /maap-jupyter-ide/ipycmc/ipycmc/nbextension/__init__.py -perl -pi -e "s|web_app.settings\['base_url'\]|'/'|g" /maap-jupyter-ide/maapsec/maapsec/__init__.py +export JUPYTERSERVERLIBPATH=`find /opt/conda/lib -maxdepth 3 -type d -name "jupyter_server"` + +read -r -d '' JUPYTER_PATCH << EOM + # Fix for Tornado's inability to handle proxy requests + from tornado.routing import _RuleList + def fix_handlers(self, handlers: _RuleList, base_url: str): + for i in range(len(handlers)): + l = list(handlers[i]) + l[0] = l[0].replace(base_url.rstrip('/'), '') + handlers[i] = tuple(l) + return handlers + + def add_handlers(self, host_pattern: str, host_handlers: _RuleList) -> None: + super().add_handlers(host_pattern, self.fix_handlers(host_handlers, self.settings['base_url'])) +EOM +if [[ -f "$JUPYTERSERVERLIBPATH/serverapp.py" ]]; then + perl -pi -e "s|(.*)\(web.Application\):|\$1\(web.Application\):\n$JUPYTER_PATCH|g" "$JUPYTERSERVERLIBPATH/serverapp.py" + perl -pi -e 's|(.*)__init__\(handlers(.*)|$1__init__\(self.fix_handlers\(handlers, base_url\)$2|g' "$JUPYTERSERVERLIBPATH/serverapp.py" +fi + +if [[ -f "$NOTEBOOKLIBPATH/notebookapp.py" ]]; then + perl -pi -e "s|(.*)\(web.Application\):|\$1\(web.Application\):\n$JUPYTER_PATCH|g" "$NOTEBOOKLIBPATH/notebookapp.py" + perl -pi -e 's|(.*)__init__\(handlers(.*)|$1__init__\(self.fix_handlers\(handlers, base_url\)$2|g' "$NOTEBOOKLIBPATH/notebookapp.py" +fi # Dump all env variables into file so they exist still though SSH env | grep _ >> /etc/environment @@ -32,4 +98,17 @@ env | grep _ >> /etc/environment export PATH=$PATH:/opt/conda/bin cp /root/.bashrc ~/.bash_profile -jupyter lab --ip=0.0.0.0 --port=3100 --allow-root --NotebookApp.token='' --LabApp.base_url=$PREVIEW_URL --no-browser --debug +# Need to fix directory permissions for publickey authentication +chmod 700 /projects +mkdir -p /projects/.ssh/ +chmod 700 /projects/.ssh/ +service ssh start + +VERSION=$(jupyter lab --version) +if [[ $VERSION > '2' ]] && [[ $VERSION < '3' ]]; then + jupyter lab --ip=0.0.0.0 --port=3100 --allow-root --NotebookApp.token='' --NotebookApp.base_url=$PREVIEW_URL --no-browser --debug +elif [[ $VERSION > '3' ]] && [[ $VERSION < '4' ]]; then + jupyter lab --ip=0.0.0.0 
--port=3100 --allow-root --ServerApp.token='' --ServerApp.base_url=$PREVIEW_URL --no-browser --debug +else + echo "Error!" +fi diff --git a/insert_defaults_to_notebook/src/index.ts b/insert_defaults_to_notebook/src/index.ts index 63b82b7d..ffb3c485 100644 --- a/insert_defaults_to_notebook/src/index.ts +++ b/insert_defaults_to_notebook/src/index.ts @@ -1,79 +1,129 @@ import { JupyterFrontEnd, JupyterFrontEndPlugin } from '@jupyterlab/application'; +import { ICommandPalette } from '@jupyterlab/apputils'; import { IDisposable, DisposableDelegate } from '@lumino/disposable'; import { ToolbarButton } from '@jupyterlab/apputils'; import { DocumentRegistry } from '@jupyterlab/docregistry'; import { NotebookActions, NotebookPanel, INotebookModel } from '@jupyterlab/notebook'; import { ElementExt } from '@lumino/domutils'; import { INotification } from "jupyterlab_toastify"; +import { PageConfig } from '@jupyterlab/coreutils' +import { request, RequestResult } from './request'; import '../style/index.css'; -const DEFAULT_CODE = 'from maap.maap import MAAP\n' + +let DEFAULT_CODE = 'from maap.maap import MAAP\n' + 'maap = MAAP()\n\n' + 'import ipycmc\n' + 'w = ipycmc.MapCMC()\n' + 'w'; +let api_server = ''; +var valuesUrl = new URL(PageConfig.getBaseUrl() + 'maapsec/environment'); + +request('get', valuesUrl.href).then((res: RequestResult) => { + if (res.ok) { + let environment = JSON.parse(res.data); + api_server = environment['api_server']; + DEFAULT_CODE = 'from maap.maap import MAAP\n' + + 'maap = MAAP(maap_host=\'' + api_server + '\')\n\n' + + 'import ipycmc\n' + + 'w = ipycmc.MapCMC()\n' + + 'w'; + } +}); + /** * A notebook widget extension that adds a button to the toolbar. */ -export -class ButtonExtension implements DocumentRegistry.IWidgetExtension { - /** - * Create a new extension object. - */ - createNew(panel: NotebookPanel, context: DocumentRegistry.IContext): IDisposable { - let callback = () => { +export class ButtonExtension implements DocumentRegistry.IWidgetExtension { + /** + * Create a new extension object. 
+ */ + createNew(panel: NotebookPanel, context: DocumentRegistry.IContext): IDisposable { + let callback = () => { - // Select the first cell of the notebook - panel.content.activeCellIndex = 0; - panel.content.deselectAll(); - ElementExt.scrollIntoViewIfNeeded( - panel.content.node, - panel.content.activeCell.node - ); + // Select the first cell of the notebook + panel.content.activeCellIndex = 0; + panel.content.deselectAll(); + ElementExt.scrollIntoViewIfNeeded( + panel.content.node, + panel.content.activeCell.node + ); - // Check if already there - if (panel.content.activeCell.model.value.text == DEFAULT_CODE) { - INotification.error("MAAP defaults already imported to notebook."); - } - else { - // Insert code above selected first cell - NotebookActions.insertAbove(panel.content); - panel.content.activeCell.model.value.text = DEFAULT_CODE; - } + // Check if already there + if (panel.content.activeCell.model.value.text == DEFAULT_CODE) { + INotification.error("MAAP defaults already imported to notebook."); + } + else { + // Insert code above selected first cell + NotebookActions.insertAbove(panel.content); + panel.content.activeCell.model.value.text = DEFAULT_CODE; + } - }; + }; - let button = new ToolbarButton({ - className: 'myButton', - iconClass: 'jp-MaapIcon foo jp-Icon jp-Icon-16 jp-ToolbarButtonComponent-icon', - onClick: callback, - tooltip: 'Import MAAP Libraries' - }); + let button = new ToolbarButton({ + className: 'myButton', + iconClass: 'jp-MaapIcon foo jp-Icon jp-Icon-16 jp-ToolbarButtonComponent-icon', + onClick: callback, + tooltip: 'Import MAAP Libraries' + }); - panel.toolbar.insertItem(0,'insertDefaults', button); - return new DisposableDelegate(() => { - button.dispose(); - }); - } + panel.toolbar.insertItem(0,'insertDefaults', button); + return new DisposableDelegate(() => { + button.dispose(); + }); + } } /** * Activate the extension. */ -function activate(app: JupyterFrontEnd) { - app.docRegistry.addWidgetExtension('Notebook', new ButtonExtension()); - console.log("insert defaults to notebook extension activated"); +function activateNbDefaults(app: JupyterFrontEnd) { + app.docRegistry.addWidgetExtension('Notebook', new ButtonExtension()); + console.log("insert defaults to notebook extension activated"); }; +function hidePanels() { + const leftPanelParent = document.querySelector('.p-TabBar-content'); + const tabsPanel = leftPanelParent.querySelector('li[title="Open Tabs"]'); + console.log('leftPanelParent'); + console.log(leftPanelParent); + console.log('tabsPanel'); + console.log(tabsPanel); + if (tabsPanel != null){ + leftPanelParent.removeChild(tabsPanel); + } + console.log('removing panel!'); +} /** * Initialization data for the insert_defaults_to_notebook extension. 
*/ -const extension: JupyterFrontEndPlugin = { - id: 'insert_defaults_to_notebook', - autoStart: true, - activate: activate +const extensionNbDefaults: JupyterFrontEndPlugin = { + id: 'insert_defaults_to_notebook', + autoStart: true, + activate: activateNbDefaults +}; + +const extensionHidePanels: JupyterFrontEndPlugin = { + id: 'hide_unused_panels', + autoStart: true, + requires: [ICommandPalette], + activate: (app: JupyterFrontEnd, palette: ICommandPalette) => { + const open_command = 'defaults:removePanel'; + + app.commands.addCommand(open_command, { + label: 'Hide Tabs', + isEnabled: () => true, + execute: args => { + hidePanels(); + } + }); + + palette.addItem({command:open_command,category:'User'}); + hidePanels(); // automatically call function at startup + console.log('remove panels activated'); + } }; -export default extension; +export default [extensionNbDefaults, extensionHidePanels]; diff --git a/insert_defaults_to_notebook/src/request.ts b/insert_defaults_to_notebook/src/request.ts new file mode 100644 index 00000000..b80bda6d --- /dev/null +++ b/insert_defaults_to_notebook/src/request.ts @@ -0,0 +1,113 @@ +/** + * This Source Code Form is subject to the terms of the Mozilla Public + * License, v. 2.0. If a copy of the MPL was not distributed with this + * file, You can obtain one at http://mozilla.org/MPL/2.0/. + */ +export interface RequestOptions { + ignoreCache?: boolean; + headers?: {[key: string]:string}; + // 0 (or negative) to wait forever + timeout?: number; + } + + export const DEFAULT_REQUEST_OPTIONS = { + ignoreCache: false, + headers: { + Accept: 'application/json, text/javascript, text/plain' + }, + // default max duration for a request in ms + // currently set to 120s = 2min + timeout: 5000, + }; + + + export interface RequestResult { + ok: boolean; + status: number; + statusText: string; + data: string; + json: () => T; + headers: string; + url: string; + } + + function queryParams(params: any = {}) { + return Object.keys(params) + .map(k => encodeURIComponent(k) + '=' + encodeURIComponent(params[k])) + .join('&'); + } + + function withQuery(url: string, params: any = {}) { + const queryString = queryParams(params); + return queryString ? url + (url.indexOf('?') === -1 ? '?' 
: '&') + queryString : url; + } + + function parseXHRResult(xhr: XMLHttpRequest): RequestResult { + return { + ok: xhr.status >= 200 && xhr.status < 300, + status: xhr.status, + statusText: xhr.statusText, + headers: xhr.getAllResponseHeaders(), + data: xhr.responseText, + json: () => JSON.parse(xhr.responseText) as T, + url: xhr.responseURL + }; + } + + function errorResponse(xhr: XMLHttpRequest, message: string | null = null): RequestResult { + return { + ok: false, + status: xhr.status, + statusText: xhr.statusText, + headers: xhr.getAllResponseHeaders(), + data: message || xhr.statusText, + json: () => JSON.parse(message || xhr.statusText) as T, + url: xhr.responseURL + }; + } + + export function request(method: 'get' | 'post', + url: string, + queryParams: any = {}, + body: any = null, + options: RequestOptions = DEFAULT_REQUEST_OPTIONS) { + + const ignoreCache = options.ignoreCache || DEFAULT_REQUEST_OPTIONS.ignoreCache; + const headers = options.headers || DEFAULT_REQUEST_OPTIONS.headers; + const timeout = options.timeout || DEFAULT_REQUEST_OPTIONS.timeout; + + return new Promise((resolve, reject) => { + const xhr = new XMLHttpRequest(); + xhr.open(method, withQuery(url, queryParams)); + + if (headers) { + Object.keys(headers).forEach(key => xhr.setRequestHeader(key, headers[key])); + } + + if (ignoreCache) { + xhr.setRequestHeader('Cache-Control', 'no-cache'); + } + + xhr.timeout = timeout; + + xhr.onload = evt => { + resolve(parseXHRResult(xhr)); + }; + + xhr.onerror = evt => { + resolve(errorResponse(xhr, 'Failed to make request.')); + }; + + xhr.ontimeout = evt => { + resolve(errorResponse(xhr, 'Request took longer than expected.')); + }; + + if (method === 'post' && body) { + xhr.setRequestHeader('Content-Type', 'application/json'); + xhr.send(JSON.stringify(body)); + } else { + xhr.send(); + } + }); + } + \ No newline at end of file diff --git a/maap_environments.json b/maap_environments.json index 3861127e..59431d65 100644 --- a/maap_environments.json +++ b/maap_environments.json @@ -24,11 +24,21 @@ "ade_server": "ade.uat.maap-project.org", "api_server": "api.uat.maap-project.org", "auth_server": "auth.uat.maap-project.org", - "mas_server": "mas.uat.maap-project.org", + "mas_server": "repo.uat.maap-project.org", "edsc_server": "ade.uat.maap-project.org:30052", "workspace_bucket": "maap-uat-workspace", "default_host": false }, + { + "environment": "pre-ops", + "ade_server": "ade.ops.maap-project.org", + "api_server": "api.ops.maap-project.org", + "auth_server": "auth.ops.maap-project.org", + "mas_server": "mas.ops.maap-project.org", + "edsc_server": "ade.ops.maap-project.org:30052", + "workspace_bucket": "maap-ops-workspace", + "default_host": false + }, { "environment": "ops", "ade_server": "ade.maap-project.org", diff --git a/pull_projects/pull_projects/handlers.py b/pull_projects/pull_projects/handlers.py index 6f198463..1a224660 100644 --- a/pull_projects/pull_projects/handlers.py +++ b/pull_projects/pull_projects/handlers.py @@ -110,21 +110,26 @@ def get(self): verify=False ) try: - resp = json.loads(r.text) # JSON response to dict - project_list = resp['config']['projects'] # gets list of projects, each is dict with project properties + resp = json.loads(r.text) + + try: + project_list = resp['devfile']['projects'] # gets list of projects, each is dict with project properties + except KeyError: + project_list = [] + self.finish({"status": "no projects to import"}) # get projects + for project in project_list: project_name = project['name'] - path = project['path'] 
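An aside on the `maap_environments.json` change above: the new environment entries are consumed by the `get_maap_config`/`maap_api` helpers this PR adds to `edsc_extension/handlers.py` (and mirrors in `submit_jobs/handlers.py`), which resolve the incoming request host to an API server. Below is a minimal sketch of that lookup with abbreviated, illustrative sample data standing in for the JSON file; the real helper reads `maap_environments.json` from disk and caches the result with `functools.lru_cache`.

```python
# Minimal sketch of the get_maap_config / maap_api host lookup.
# SAMPLE_ENVIRONMENTS is illustrative; see maap_environments.json for real values.
SAMPLE_ENVIRONMENTS = [
    {"environment": "uat",
     "ade_server": "ade.uat.maap-project.org",
     "api_server": "api.uat.maap-project.org",
     "default_host": False},
    {"environment": "ops",
     "ade_server": "ade.maap-project.org",
     "api_server": "api.maap-project.org",
     "default_host": True},
]

def get_maap_config(host):
    # Prefer the environment whose ADE server contains the requesting host;
    # otherwise fall back to the entry flagged as the default host.
    match = next((x for x in SAMPLE_ENVIRONMENTS if host in x['ade_server']), None)
    default = next((x for x in SAMPLE_ENVIRONMENTS if x['default_host'] is True), None)
    return default if match is None else match

def maap_api(host):
    return get_maap_config(host)['api_server']

print(maap_api('ade.uat.maap-project.org'))  # -> api.uat.maap-project.org
print(maap_api('localhost:3100'))            # -> falls back to the default_host entry
```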
src_type = project['source']['type'] location = project['source']['location'] dl_loc = '/projects/'+project_name if src_type == 'git': - if not os.path.exists('/projects'+path): + if not os.path.exists('/projects/'+project_name): # Check if is stored on our gitlab (e.g. mas.maap-project.org) if so, use the users authentication # token to allow for the downloads of private repositories diff --git a/show_ssh_info/show_ssh_info/__init__.py b/show_ssh_info/show_ssh_info/__init__.py index ef327b77..2edb439a 100644 --- a/show_ssh_info/show_ssh_info/__init__.py +++ b/show_ssh_info/show_ssh_info/__init__.py @@ -2,7 +2,7 @@ import os import os.path from notebook.utils import url_path_join -from .handlers import GetHandler, CheckInstallersHandler, InstallHandler, InjectKeyHandler, MountBucketHandler, Presigneds3UrlHandler, MountOrgBucketsHandler, MountSharedBucketsHandler +from .handlers import GetHandler, InjectKeyHandler, Presigneds3UrlHandler def _jupyter_server_extension_paths(): return [{ @@ -24,11 +24,6 @@ def load_jupyter_server_extension(nb_server_app): print('base_url is '+base_url) web_app.add_handlers(host_pattern, [(url_path_join(base_url, 'show_ssh_info/get'), GetHandler)]) - web_app.add_handlers(host_pattern, [(url_path_join(base_url, 'show_ssh_info/checkInstallers'), CheckInstallersHandler)]) - web_app.add_handlers(host_pattern, [(url_path_join(base_url, 'show_ssh_info/install'), InstallHandler)]) web_app.add_handlers(host_pattern, [(url_path_join(base_url, 'show_ssh_info/inject_public_key'), InjectKeyHandler)]) - web_app.add_handlers(host_pattern, [(url_path_join(base_url, 'show_ssh_info/mountBucket'), MountBucketHandler)]) - web_app.add_handlers(host_pattern, [(url_path_join(base_url, 'show_ssh_info/mountSharedBucket'), MountSharedBucketsHandler)]) - web_app.add_handlers(host_pattern, [(url_path_join(base_url, 'show_ssh_info/getOrgs'), MountOrgBucketsHandler)]) web_app.add_handlers(host_pattern, [(url_path_join(base_url, 'show_ssh_info/getSigneds3Url'), Presigneds3UrlHandler)]) diff --git a/show_ssh_info/show_ssh_info/handlers.py b/show_ssh_info/show_ssh_info/handlers.py index 8ee30560..07e89ed0 100644 --- a/show_ssh_info/show_ssh_info/handlers.py +++ b/show_ssh_info/show_ssh_info/handlers.py @@ -84,101 +84,42 @@ class GetHandler(IPythonHandler): """ def get(self): - # Get Port from Kubernetes - host = os.environ.get('KUBERNETES_SERVICE_HOST') - host_port = os.environ.get('KUBERNETES_PORT_443_TCP_PORT') - workspace_id = os.environ.get('CHE_WORKSPACE_ID') - - with open ("/var/run/secrets/kubernetes.io/serviceaccount/token", "r") as t: - token=t.read() - - headers = { - 'Authorization': 'Bearer ' + token, - } - - request_string = 'https://' + host + ':' + host_port + '/api/v1/namespaces/' + workspace_id + '/services/ws' - response = requests.get(request_string, headers=headers, verify=False) - - data = response.json() - port = data['spec']['ports'][0]['nodePort'] - - - # Get external IP address - ip = get('https://api.ipify.org').text - - self.finish({'ip': ip, 'port': port}) - return - -class CheckInstallersHandler(IPythonHandler): - """ - Check if SSH and exec Che Installers are enabled. If they are not, a user would not be able to ssh in becuase there - would be no SSH agent. - """ - def get(self): - # - # TODO: DELTE THIS LINE!!!!! IT MAKES THE CHECK NOT HAPPEN!!! 
- # - # self.finish({'status': True}) - - che_machine_token = os.environ['CHE_MACHINE_TOKEN'] - url = '{}/api/workspace/{}'.format(maap_ade_url(self.request.host), os.environ.get('CHE_WORKSPACE_ID')) - # -------------------------------------------------- - # TODO: FIGURE OUT AUTH KEY & verify - # -------------------------------------------------- - headers = { - 'Accept': 'application/json', - 'Authorization': 'Bearer {token}'.format(token=che_machine_token) - } - r = requests.get( - url, - headers=headers, - verify=False - ) - - resp = json.loads(r.text) # JSON response to dict - installers = resp['config']['environments']["default"]["machines"]["ws/jupyter"]['installers'] - # Check installers - if 'org.eclipse.che.ssh' in installers and 'org.eclipse.che.exec' in installers: - self.finish({'status': True}) - else: - self.finish({'status': False}) - -class InstallHandler(IPythonHandler): - """ - Update workspace config to enable SSH and exec installers. Not sure if the workspace has to be maunually restarted - at this point or if I can restart it. - """ - def get(self): - - che_machine_token = os.environ['CHE_MACHINE_TOKEN'] - url = '{}/api/workspace/{}'.format(maap_ade_url(self.request.host), os.environ.get('CHE_WORKSPACE_ID')) - # -------------------------------------------------- - # TODO: FIGURE OUT AUTH KEY & verify - # -------------------------------------------------- - headers = { - 'Accept': 'application/json', - 'Authorization': 'Bearer {token}'.format(token=che_machine_token) - } - r = requests.get( - url, - headers=headers, - verify=False - ) - - installers = ['org.eclipse.che.ssh', 'org.eclipse.che.exec'] - workspace_config = json.loads(r.text) # JSON response to dict - - # Update workspace config with new installers - workspace_config['config']['environments']["default"]["machines"]["ws/jupyter"]['installers'] = installers - - r = requests.put( - url, - headers=headers, - verify=False - ) + try: + svc_host = os.environ.get('KUBERNETES_SERVICE_HOST') + svc_host_https_port = os.environ.get('KUBERNETES_SERVICE_PORT_HTTPS') + namespace = os.environ.get('CHE_WORKSPACE_NAMESPACE') + '-che' + che_workspace_id = os.environ.get('CHE_WORKSPACE_ID') + sshport_name = 'sshport' + + ip = requests.get('https://api.ipify.org').text + + with open ("/var/run/secrets/kubernetes.io/serviceaccount/token", "r") as t: + token=t.read() + + headers = { + 'Authorization': 'Bearer ' + token, + } + + request_string = 'https://' + svc_host + ':' + svc_host_https_port + '/api/v1/namespaces/' + namespace + '/services/' + response = requests.get(request_string, headers=headers, verify=False) + data = response.json() + endpoints = data['items'] + + # Ssh service is running on a seperate container from the user workspace. Query the kubernetes host service to find the container where the nodeport has been set. + for endpoint in endpoints: + if sshport_name in endpoint['metadata']['name']: + if che_workspace_id == endpoint['metadata']['labels']['che.workspace_id']: + port = endpoint['spec']['ports'][0]['nodePort'] + self.finish({'ip': ip, 'port': port}) + + self.finish({"status": 500, "message": "failed to get ip and port"}) + except: + self.finish({"status": 500, "message": "failed to get ip and port"}) - self.finish(r.status_code) +""" +No longer in use. Mounting now is happening outside of the Jupyter container. 
+""" class MountBucketHandler(IPythonHandler): def get(self): message = '' @@ -227,6 +168,7 @@ def get(self): # mount whole bucket first mount_output = subprocess.check_output('s3fs -o iam_role=auto -o imdsv1only -o use_cache=/tmp/cache {} {}'.format(bucket,user_workspace), shell=True).decode('utf-8') + message = mount_output logging.debug('mount log {}'.format(mount_output)) @@ -253,6 +195,7 @@ def get(self): logging.debug('umount output {}'.format(umount_output)) mountdir_output = subprocess.check_output('s3fs -o iam_role=auto -o imdsv1only -o use_cache=/tmp/cache {} {}'.format(user_bucket_dir,user_workspace), shell=True).decode('utf-8') + message = mountdir_output logging.debug('mountdir output {}'.format(mountdir_output)) @@ -260,6 +203,10 @@ def get(self): except: self.finish({"status_code":500, "message":message, "user_workspace":user_workspace,"user_bucket_dir":user_bucket_dir}) + +""" +No longer in use. Mounting now is happening outside of the Jupyter container. +""" class MountSharedBucketsHandler(IPythonHandler): def get(self): message = '' @@ -311,11 +258,14 @@ def get(self): except: self.finish({"status_code":500, "message":message, "shared_workspaces":shared_workspaces}) +""" +No longer in use. Mounting now is happening outside of the Jupyter container. +""" class MountOrgBucketsHandler(IPythonHandler): def get(self): # Send request to Che API for list of user's orgs # ts pass keycloak token from window - token = self.get_argument('token','') + token = self.get_argument('token', '') bucket = dps_bucket_name(self.request.host) url = '{}/api/organization'.format(maap_ade_url(self.request.host)) headers = { @@ -325,8 +275,8 @@ def get(self): try: # send request resp = requests.get( - url, - headers=headers, + url, + headers=headers, verify=False ) logging.debug(resp) @@ -421,25 +371,62 @@ def get(self): self.finish({"status_code":resp.status_code, "message":"error requesting Che organizations", "org_workspaces":[],"org_bucket_dirs":[]}) class Presigneds3UrlHandler(IPythonHandler): + def get(self): # get arguments bucket = dps_bucket_name(self.request.host) - key = self.get_argument('key','') - proxyTicket = self.get_argument('proxy-ticket','') - logging.debug('bucket is '+bucket) - logging.debug('key is '+key) - - expiration = '43200' # 12 hrs in seconds - logging.debug('expiration is {} seconds'+expiration) - - url = '{}/api/members/self/presignedUrlS3/{}/{}?exp={}'.format(maap_api_url(self.request.host), bucket, key, expiration) - - headers = {'Accept': 'application/json', 'proxy-ticket': proxyTicket} + key = self.get_argument('key', '') + rt_path = os.path.expanduser(self.get_argument('home_path', '')) + abs_path = os.path.join(rt_path, key) + proxy_ticket = self.get_argument('proxy-ticket','') + expiration = self.get_argument('duration','86400') # default 24 hrs + che_ws_namespace = os.environ.get('CHE_WORKSPACE_NAMESPACE') + + logging.debug('bucket is '+bucket) + logging.debug('key is '+key) + logging.debug('full path is '+abs_path) + + # ----------------------- + # Checking for bad input + # ----------------------- + # if directory, return error - dirs not supported + if os.path.isdir(abs_path): + self.finish({"status_code": 412, "message": "error", "url": "Presigned S3 links do not support folders"}) + return + + # check if file in valid folder (under mounted folder path) + resp = subprocess.check_output("df -h | grep s3fs | awk '{print $6}'", shell=True).decode('utf-8') + mounted_dirs = resp.strip().split('\n') + logging.debug(mounted_dirs) + if len(mounted_dirs) == 0: + 
self.finish({"status_code": 412, "message": "error", + "url": "Presigned S3 links can only be created for files in a mounted org or user folder" + + "\nMounted folders include:\n{}".format(resp) + }) + return + + if not any([mounted_dir in abs_path for mounted_dir in mounted_dirs]): + self.finish({"status_code": 412, "message": "error", + "url": "Presigned S3 links can only be created for files in a mounted org or user folder" + + "\nMounted folders include:\n{}".format(resp) + }) + return + + # ----------------------- + # Generate S3 Link + # ----------------------- + # if valid path, get presigned URL + # expiration = '43200' # 12 hrs in seconds + logging.debug('expiration is {} seconds', expiration) + + url = '{}/api/members/self/presignedUrlS3/{}/{}?exp={}&ws={}'.format(maap_api_url(self.request.host), bucket, key, expiration, che_ws_namespace) + headers = {'Accept': 'application/json', 'proxy-ticket': proxy_ticket} r = requests.get( url, headers=headers, verify=False ) + logging.debug(r.text) resp = json.loads(r.text) self.finish({"status_code":200, "message": "success", "url":resp['url']}) diff --git a/show_ssh_info/src/dialogs.ts b/show_ssh_info/src/dialogs.ts new file mode 100644 index 00000000..7d1bab17 --- /dev/null +++ b/show_ssh_info/src/dialogs.ts @@ -0,0 +1,89 @@ +import { Dialog } from '@jupyterlab/apputils'; + +const notImplemented: string[] = []; + +export class DialogEnter extends Dialog { + /** + * Create a dialog panel instance. + * + * @param options - The dialog setup options. + */ + constructor(options: Partial> = {}) { + super(options); + } + + handleEvent(event: Event): void { + switch (event.type) { + case 'keydown': + this._evtKeydown(event as KeyboardEvent); + break; + case 'click': + this._evtClick(event as MouseEvent); + break; + case 'focus': + this._evtFocus(event as FocusEvent); + break; + case 'contextmenu': + event.preventDefault(); + event.stopPropagation(); + break; + default: + break; + } + } + + protected _evtKeydown(event: KeyboardEvent): void { + // Check for escape key + switch (event.keyCode) { + case 13: // Enter. 
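// Editorial note: case 13 (Enter) is left as a deliberate no-op here (the resolve call is commented out below), presumably so that pressing Enter inside this dialog does not auto-submit it; all other keys fall through to the default Dialog keydown handling in the default case.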
+ //event.stopPropagation(); + //event.preventDefault(); + //this.resolve(); + break; + default: + super._evtKeydown(event); + break; + } + } +} + +export function showDialogEnter( + options: Partial> = {} +): void { + let dialog = new DialogEnter(options); + dialog.launch(); + // setTimeout(function(){console.log('go away'); dialog.resolve(0);}, 3000); + return; +} + +export function popup(b:any): void { + if ( !(notImplemented.includes(b.req) )){ + popupTitle(b,b.popupTitle); + } else { + console.log("not implemented yet"); + popupResult("Not Implemented yet","Not Implemented yet") + } +} + +export function popupTitle(b:any,popupTitle:string): void { + showDialogEnter({ + title: popupTitle, + body: b, + focusNodeSelector: 'input', + buttons: [Dialog.okButton({ label: 'Ok' }), Dialog.cancelButton({ label : 'Cancel'})] + }); +} + +export function popupResult(b:any,popupTitle:string): void { + showDialogEnter({ + title: popupTitle, + body: b, + focusNodeSelector: 'input', + buttons: [Dialog.okButton({ label: 'Ok' })] + }); +} + +export function isEmpty(obj:any) { + return Object.keys(obj).length === 0; +} + diff --git a/show_ssh_info/src/funcs.ts b/show_ssh_info/src/funcs.ts index 07d75d35..62818a7a 100644 --- a/show_ssh_info/src/funcs.ts +++ b/show_ssh_info/src/funcs.ts @@ -1,77 +1,29 @@ -import {PageConfig} from "@jupyterlab/coreutils"; -import {Dialog, ICommandPalette, showDialog} from "@jupyterlab/apputils"; -import {getToken, getUserInfo, getUserInfoAsyncWrapper} from "./getKeycloak"; -import {INotification} from "jupyterlab_toastify"; -import {JupyterFrontEnd} from "@jupyterlab/application"; -import {IFileBrowserFactory} from "@jupyterlab/filebrowser"; +import { JupyterFrontEnd } from "@jupyterlab/application"; +import { PageConfig } from "@jupyterlab/coreutils"; +import { Dialog, ICommandPalette, showDialog } from "@jupyterlab/apputils"; +import { IFileBrowserFactory } from "@jupyterlab/filebrowser"; import { IStateDB } from '@jupyterlab/statedb'; -import {Widget} from "@lumino/widgets"; +// import { Widget } from "@lumino/widgets"; +import { INotification } from "jupyterlab_toastify"; +import { getToken, getUserInfo, getUserInfoAsyncWrapper } from "./getKeycloak"; +import { SshWidget, UserInfoWidget } from './widgets'; +import { DropdownSelector } from './selector'; +import { popupResult } from './dialogs'; import { request, RequestResult } from './request'; -import { SshWidget, InstallSshWidget, UserInfoWidget } from './widgets' const profileId = 'maapsec-extension:IMaapProfile'; -export function popup(b:Widget,title:string): void { +export async function checkSSH() { showDialog({ - title: title, - body: b, + title: 'SSH Info:', + body: new SshWidget(), focusNodeSelector: 'input', - buttons: [Dialog.okButton({ label: 'Ok' }), Dialog.cancelButton({ label : 'Cancel'})] + buttons: [Dialog.okButton({label: 'Ok'})] }); } -export async function checkSSH() { - // - // Check if SSH and Exec Installers have been activated - // - request('get', PageConfig.getBaseUrl() + "show_ssh_info/checkInstallers") - .then((res: RequestResult) => { - if(res.ok){ - let json_results:any = res.json(); - let status = json_results['status']; - - // - // If installers have been activated, show ssh info - // - if (status) { - showDialog({ - title: 'SSH Info:', - body: new SshWidget(), - focusNodeSelector: 'input', - buttons: [Dialog.okButton({ label: 'Ok' })] - }); - } - - // - // Otherwise, ask the user if they want to enable the installers - // - else { - showDialog({ - title: 'SSH Info:', - body: new 
InstallSshWidget(), - focusNodeSelector: 'input', - buttons: [Dialog.okButton({ label: 'Ok' }),] - // buttons: [Dialog.okButton({ label: 'Activate SSH' }), Dialog.cancelButton()] - }).then(result => { - if (result.button.label === 'Activate SSH') { - // Make Call To Activate - request('get', PageConfig.getBaseUrl() + "show_ssh_info/install") - // Restart workspace??? - } - // User does not want to activate installers - else { - return; - } - }); - } - - } - }); -} - export function checkUserInfo(): void { getUserInfo(function(profile: any) { - // console.log(profile); if (profile['cas:username'] === undefined) { INotification.error("Get user profile failed."); return; @@ -90,65 +42,20 @@ export function checkUserInfo(): void { }); } -export async function mountUserFolder(state: IStateDB) { - - getUserInfo(function(profile: any) { - // get username from keycloak - if (profile['cas:username'] === undefined) { - INotification.error("Get username failed, did not mount bucket."); - return; - } - // send username to backend to create local mount point and mount s3 bucket - let username = profile['cas:username'] - var getUrl = new URL(PageConfig.getBaseUrl() + 'show_ssh_info/mountBucket'); - getUrl.searchParams.append('username',username); - - request('get', getUrl.href).then((res: RequestResult) => { - if (res.ok) { - let data:any = JSON.parse(res.data); - if (data.status_code == 200) { - let user_workspace = data.user_workspace; - let user_bucket_dir = data.user_bucket_dir; - INotification.success('Mounted user workspace '+user_workspace+' to '+user_bucket_dir); - } else { - INotification.error('Failed to mount user workspace to s3'); - } - } else { - INotification.error('Failed to mount user workspace to s3'); - } - }); - }); -} - -export async function mountOrgFolders(state: IStateDB) { - // do something - let token = getToken(); - var getUrl = new URL(PageConfig.getBaseUrl() + 'show_ssh_info/getOrgs'); - getUrl.searchParams.append('token',token); - request('get', getUrl.href).then((res: RequestResult) => { - if (res.ok) { - let data:any = JSON.parse(res.data); - if (data.status_code == 200) { - console.log(data); - INotification.success('Successfully mounted organization and sub-organization folders') - } else { - INotification.error('Failed to get user\'s Che orgs'); - } - } else { - INotification.error('Failed to get user\'s Che orgs'); - } - }); -} - -export async function getPresignedUrl(state: IStateDB, key:string): Promise { +export async function getPresignedUrl(state: IStateDB, key:string, duration:string): Promise { const profile = await getUsernameToken(state); return new Promise(async (resolve, reject) => { let presignedUrl = ''; + let token = getToken(); var getUrl = new URL(PageConfig.getBaseUrl() + 'show_ssh_info/getSigneds3Url'); - getUrl.searchParams.append('key',key); + getUrl.searchParams.append('home_path', PageConfig.getOption('serverRoot')); + getUrl.searchParams.append('key', key); + getUrl.searchParams.append('username', profile.uname); + getUrl.searchParams.append('token', token); getUrl.searchParams.append('proxy-ticket', profile.ticket); + getUrl.searchParams.append('duration', duration); request('get', getUrl.href).then((res: RequestResult) => { if (res.ok) { let data:any = JSON.parse(res.data); @@ -160,7 +67,7 @@ export async function getPresignedUrl(state: IStateDB, key:string): Promise { - let display = url; - if (url.substring(0,5) == 'https'){ - display = ''+url+''; - } - - let body = document.createElement('div'); - body.style.display = 'flex'; - 
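An aside on the presigned-URL flow being reworked in this file: the client-side `getPresignedUrl` in `funcs.ts` now passes the workspace root, file key, CAS proxy ticket, and an expiration duration to the `show_ssh_info/getSigneds3Url` server extension shown earlier in this diff. Below is a hedged sketch of that request/response contract exercised with Python's requests library; the workspace URL, file key, and ticket are placeholders.

```python
# Sketch of the getSigneds3Url contract; all values below are placeholders.
import requests

workspace_base = "https://ade.maap-project.org/<workspace-path>/"  # placeholder Jupyter base URL
params = {
    "home_path": "/projects",              # PageConfig.getOption('serverRoot') on the client
    "key": "my-folder/output.tif",         # placeholder: file selected in the file browser
    "proxy-ticket": "<cas-proxy-ticket>",  # placeholder credential
    "duration": "86400",                   # seconds; the UI offers 24 hours, 1 week, 30 days
}
resp = requests.get(workspace_base + "show_ssh_info/getSigneds3Url", params=params)
# On success the handler returns:
#   {"status_code": 200, "message": "success", "url": "https://<presigned-s3-url>"}
print(resp.json())
```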
body.style.flexDirection = 'column'; - - var textarea = document.createElement("div"); - textarea.id = 'result-text'; - textarea.style.display = 'flex'; - textarea.style.flexDirection = 'column'; - textarea.innerHTML = "
"+display+"
"; - - body.appendChild(textarea); - - showDialog({ - title: 'Presigned Url', - body: new Widget({node:body}), - focusNodeSelector: 'input', - buttons: [Dialog.okButton({label: 'Ok'})] - }); - }); + let expirationOptions = ['86400 (24 hours)','604800 (1 week)','2592000 (30 days)']; + let dropdownSelector = new DropdownSelector(expirationOptions, '86400 (24 hours)', state, path); + popupResult(dropdownSelector, 'Select an Expiration Duration'); }, isVisible: () => tracker.currentWidget && diff --git a/show_ssh_info/src/getKeycloak.js b/show_ssh_info/src/getKeycloak.js index 993e8650..98a4758a 100644 --- a/show_ssh_info/src/getKeycloak.js +++ b/show_ssh_info/src/getKeycloak.js @@ -1,12 +1,7 @@ export var getUserInfo = function(callback) { - - console.log(window.parent); window.parent._keycloak.loadUserInfo().success(function(profile) { - console.log(profile); - // key = profile['public_ssh_keys']; callback(profile); - }).error(function() { console.log('Failed to load profile.'); return "error"; diff --git a/show_ssh_info/src/index.ts b/show_ssh_info/src/index.ts index 0dbd0bdc..6fb639b0 100644 --- a/show_ssh_info/src/index.ts +++ b/show_ssh_info/src/index.ts @@ -4,7 +4,7 @@ import { IFileBrowserFactory } from '@jupyterlab/filebrowser'; import { ILauncher } from '@jupyterlab/launcher'; import { IStateDB } from '@jupyterlab/statedb'; -import { checkUserInfo, mountUserFolder, checkSSH, activateGetPresignedUrl, mountOrgFolders} from './funcs' +import { checkUserInfo, checkSSH, activateGetPresignedUrl } from './funcs' import { InjectSSH } from './widgets' import { updateKeycloakToken } from "./getKeycloak"; import '../style/index.css'; @@ -70,55 +70,6 @@ const extensionUser: JupyterFrontEndPlugin = { }; -/////////////////////////////////////////////////////////////// -// -// Mount user workspace extension -// -/////////////////////////////////////////////////////////////// -const extensionMount: JupyterFrontEndPlugin = { - id: 'mount-s3-folder', - autoStart: true, - requires: [ICommandPalette, IStateDB], - optional: [], - activate: (app: JupyterFrontEnd, palette: ICommandPalette, state: IStateDB) => { - const open_command = 'sshinfo:mount'; - - app.commands.addCommand(open_command, { - label: 'User Workspace Mount', - isEnabled: () => true, - execute: args => { - mountUserFolder(state); - } - }); - palette.addItem({command:open_command,category:'User'}); - mountUserFolder(state); // automatically mount user folder on load - } -}; - -/////////////////////////////////////////////////////////////// -// -// Mount org buckets extension -// -/////////////////////////////////////////////////////////////// -const extensionMountOrgBuckets: JupyterFrontEndPlugin = { - id: 'mount-che-org-buckets', - requires: [ICommandPalette, IStateDB], - autoStart: true, - activate: (app: JupyterFrontEnd, palette: ICommandPalette, state: IStateDB) => { - const open_command = 'sshinfo:orgs'; - app.commands.addCommand(open_command, { - label: 'Che Org Workspace Mount', - isEnabled: () => true, - execute: args => { - mountOrgFolders(state); - } - }); - palette.addItem({command:open_command,category:'User'}); - mountOrgFolders(state); - } -}; - - /////////////////////////////////////////////////////////////// // // Presigned URL extension @@ -154,7 +105,4 @@ const extensionRefreshToken: JupyterFrontEndPlugin = { } }; - - - -export default [extensionSsh, extensionUser, extensionMount, extensionPreSigneds3Url, extensionRefreshToken, extensionMountOrgBuckets]; +export default [extensionSsh, extensionUser, 
extensionPreSigneds3Url, extensionRefreshToken]; diff --git a/show_ssh_info/src/selector.ts b/show_ssh_info/src/selector.ts new file mode 100644 index 00000000..eb4a80c7 --- /dev/null +++ b/show_ssh_info/src/selector.ts @@ -0,0 +1,91 @@ +import { Widget } from "@lumino/widgets"; +import { Clipboard, Dialog, showDialog } from "@jupyterlab/apputils"; +import { IStateDB } from '@jupyterlab/statedb'; +import { getPresignedUrl } from './funcs'; + +export class DropdownSelector extends Widget { + private _dropdown: HTMLSelectElement; + public selected: string; + + constructor(options:string[], private defaultOption:string, private state: IStateDB, public path:string) { + super(); + this._dropdown = document.createElement("SELECT"); + if (! defaultOption) { + this.defaultOption = ''; + } + + let opt:HTMLOptionElement; + for (let option of options) { + opt = document.createElement("option"); + if (this.defaultOption === option) { + opt.setAttribute("selected","selected"); + } + opt.setAttribute("id", option); + opt.setAttribute("label",option); + opt.appendChild(document.createTextNode(option)); + this._dropdown.appendChild(opt); + } + this.node.appendChild(this._dropdown); + } + + getValue() { + this.selected = this._dropdown.value; + let ind = this.selected.indexOf('('); + if (ind > -1) { + this.selected = this.selected.substr(0,ind).trim(); + } + + // guarantee default value + if (this.selected == null || this.selected == '') { + this.selected = this.defaultOption; + console.log('no option selected, using '+this.defaultOption); + } + console.log(this.selected); + + // send request to get url + getPresignedUrl(this.state, this.path, this.selected).then((url:string) => { + let display = url; + let validUrl = false; + if (url.substring(0,5) == 'https'){ + validUrl = true; + display = 'Link will expire in '+this._dropdown.value+'
'; + display = display + ''+url+''; + } else { + display = url + } + + let body = document.createElement('div'); + body.style.display = 'flex'; + body.style.flexDirection = 'column'; + + let textarea = document.createElement("div"); + textarea.id = 'result-text'; + textarea.style.display = 'flex'; + textarea.style.flexDirection = 'column'; + textarea.innerHTML = "
"+display+"
"; + + body.appendChild(textarea); + + // Copy URL to clipboard button if url created + if (validUrl){ + let copyBtn = document.createElement('button'); + copyBtn.id = 's3-link-copy-button'; + copyBtn.className = 'jupyter-button'; + copyBtn.innerHTML = 'Copy Link'; + copyBtn.style.width = "200px"; + copyBtn.addEventListener('click', function() { + Clipboard.copyToSystem(url); + }, false); + + body.appendChild(copyBtn); + } + + showDialog({ + title: 'Presigned Url', + body: new Widget({node:body}), + focusNodeSelector: 'input', + buttons: [Dialog.okButton({label: 'Ok'})] + }); + }); + } +} \ No newline at end of file diff --git a/show_ssh_info/src/widgets.ts b/show_ssh_info/src/widgets.ts index c62c18ef..1c5f1dbf 100644 --- a/show_ssh_info/src/widgets.ts +++ b/show_ssh_info/src/widgets.ts @@ -26,20 +26,6 @@ class SshWidget extends Widget { } } -export -class InstallSshWidget extends Widget { - constructor() { - let body = document.createElement('div'); - body.style.display = 'flex'; - body.style.flexDirection = 'column'; - - let message = "SSH has not been enabled in your workspace. In order to enable SSH navigate to your workspace admin page. Under the tab Installers, turn on SSH and EXEC and click apply. NOTE: This will restart your workspace and take a few minutes."; - let contents = document.createTextNode(message); - body.appendChild(contents); - super({ node: body }); - } -} - export class UserInfoWidget extends Widget { constructor(username:string,email:string,org:string) { diff --git a/submit_jobs/maap-request-specs.md b/submit_jobs/maap-request-specs.md index df2062c0..48aaa154 100644 --- a/submit_jobs/maap-request-specs.md +++ b/submit_jobs/maap-request-specs.md @@ -52,10 +52,6 @@ url: https://api.maap.xyz/api/mas/algorithm "field": "pass_number", "download": false }, - { - "field": "timestamp", - "download": false - }, { "field": "username", "download": false @@ -128,11 +124,6 @@ url: http://api.maap.xyz/api/dps/job 2 - - - 2018-03-26T00:00:01Z - - ``` diff --git a/submit_jobs/src/fields.json b/submit_jobs/src/fields.json index 63b932ed..6c4b22e8 100644 --- a/submit_jobs/src/fields.json +++ b/submit_jobs/src/fields.json @@ -1,5 +1,5 @@ { - "register": ["algo_name","version","run_command","memory","inputs"], + "register": ["algo_name","version","run_command","queue","inputs"], "deleteAlgorithm": ["algo_id","version","proxy-ticket"], "publishAlgorithm": ["algo_id","version","proxy-ticket"], "getCapabilities": [], diff --git a/submit_jobs/src/selector.ts b/submit_jobs/src/selector.ts index 29dbb8af..2c19f292 100644 --- a/submit_jobs/src/selector.ts +++ b/submit_jobs/src/selector.ts @@ -23,7 +23,6 @@ export class DropdownSelector extends Widget { this._fields = fields; this._username = uname; this._ticket = ticket; - this.selection = ''; this.type = type; this._state = state; diff --git a/submit_jobs/src/widgets.ts b/submit_jobs/src/widgets.ts index ddf90aaf..f22e51f5 100644 --- a/submit_jobs/src/widgets.ts +++ b/submit_jobs/src/widgets.ts @@ -405,7 +405,7 @@ export class RegisterWidget extends InputWidget { this.node.appendChild(fieldInputs); }, 500); - } else if (field === 'memory') { + } else if (field === 'queue') { let fieldLabel = document.createElement('Label'); fieldLabel.innerHTML = field; this.node.appendChild(fieldLabel); @@ -471,7 +471,7 @@ export class RegisterWidget extends InputWidget { let fieldElement:HTMLSelectElement = document.getElementById('queues-dropdown') as HTMLSelectElement; console.log(fieldElement); let opt:string = fieldElement.value; - 
getUrl.searchParams.append('memory', opt); + getUrl.searchParams.append('queue', opt); console.log(getUrl.href); console.log('done setting url'); urllst.push(getUrl); diff --git a/submit_jobs/submit_jobs/handlers.py b/submit_jobs/submit_jobs/handlers.py index feef5a98..e295e062 100644 --- a/submit_jobs/submit_jobs/handlers.py +++ b/submit_jobs/submit_jobs/handlers.py @@ -18,8 +18,6 @@ logger = logging.getLogger() logger.setLevel(logging.DEBUG) -maap = MAAP() - FILEPATH = os.path.dirname(os.path.abspath(__file__)) WORKDIR = FILEPATH+'/..' # WORKDIR = os.getcwd()+'/../../submit_jobs' @@ -107,8 +105,8 @@ def get_maap_config(host): return maap_config -def maap_api_url(host): - return 'https://{}/api'.format(get_maap_config(host)['api_server']) +def maap_api(host): + return get_maap_config(host)['api_server'] # currently allows repos from both repo.nasa.maap and mas.maap-project class RegisterAlgorithmHandler(IPythonHandler): @@ -117,7 +115,7 @@ def get(self, **params): # Part 1: Parse Required Arguments # ================================== # logging.debug('workdir is '+WORKDIR) - fields = ['config_path', 'memory'] + fields = ['config_path', 'queue'] params = {} for f in fields: try: @@ -141,12 +139,12 @@ def get(self, **params): if config['inputs'] in ['null', None]: config['inputs'] = '' - if 'memory' in params.keys(): - config['memory'] = params['memory'] + if 'queue' in params.keys(): + config['queue'] = params['queue'] # TODO: bug fix needed -- this mangles the input section of the config yaml # Commenting out for now - # # overwrite config yaml with memory + # # overwrite config yaml with queue # config_template = WORKDIR+"/submit_jobs/register.yaml" # new_config = '' # with open(config_template, 'r') as infile: @@ -165,7 +163,7 @@ def get(self, **params): json_file = WORKDIR+"/submit_jobs/register_url.json" # only description and inputs are allowed to be empty - for f in ['algo_name','version','environment','run_command','repository_url','memory','docker_url']: + for f in ['algo_name','version','environment','run_command','repository_url','queue','docker_url']: if config[f] == '' or config[f] == None: self.finish({"status_code": 412, "result": "Error: Register field {} cannot be empty".format(f)}) return @@ -179,7 +177,7 @@ def get(self, **params): logging.debug('repo url is {}'.format(config['repository_url'])) # check if repo is hosted on a MAAP GitLab instance - if (not ('repo.nasa.maap') in config['repository_url']) and (not ('mas.maap-project') in config['repository_url']): + if (not ('repo.nasa.maap') in config['repository_url']) and (not ('maap-project.org') in config['repository_url']): self.finish({"status_code": 412, "result": "Error: Your git repo is not from a supported host (e.g. 
mas.maap-project.org)"}) return @@ -245,7 +243,7 @@ def get(self, **params): # ================================== # Part 4: Check Response # ================================== - maap = MAAP() + maap = MAAP(maap_api(self.request.host)) try: r = maap.registerAlgorithm(req_json) logging.debug(r.text) @@ -290,7 +288,7 @@ def get(self): # ================================== # Part 2: Build & Send Request (outsourced to maap-py lib) # ================================== - maap = MAAP() + maap = MAAP(maap_api(self.request.host)) if complete: r = maap.deleteAlgorithm('{algo_id}:{version}'.format(**params)) else: @@ -357,7 +355,7 @@ def get(self): # ================================== # Part 2: Build & Send Request (outsourced to maap-py lib) # ================================== - maap = MAAP() + maap = MAAP(maap_api(self.request.host)) r = maap.publishAlgorithm('{algo_id}:{version}'.format(**params)) # print(r.status_code) @@ -406,7 +404,7 @@ def get(self): # ================================== # Part 1: Build & Send Request (outsourced to maap-py lib) # ================================== - maap = MAAP() + maap = MAAP(maap_api(self.request.host)) r = maap.getCapabilities() # ================================== @@ -455,7 +453,7 @@ def args_to_dict(self): def get(self): # outsourced to maap-py lib kwargs = self.args_to_dict() - maap = MAAP() + maap = MAAP(maap_api(self.request.host)) resp = maap.submitJob(**kwargs) logger.debug(resp) status_code = resp['http_status_code'] @@ -491,7 +489,7 @@ def get(self): # ================================== # Part 2: Build & Send Request (outsourced to maap-py lib) # ================================== - maap = MAAP() + maap = MAAP(maap_api(self.request.host)) try: r = maap.getJobStatus(params['job_id']) # print(r.status_code) @@ -553,7 +551,7 @@ def get(self): # ================================== # Part 2: Build & Send Request (outsourced to maap-py lib) # ================================== - maap = MAAP() + maap = MAAP(maap_api(self.request.host)) try: r = maap.getJobMetrics(params['job_id']) # print(r.status_code) @@ -610,7 +608,7 @@ def get(self): # ================================== # Part 2: Build & Send Request (outsourced to maap-py lib) # ================================== - maap = MAAP() + maap = MAAP(maap_api(self.request.host)) try: r = maap.getJobResult(params['job_id']) # print(r.status_code) @@ -712,7 +710,7 @@ def get(self): # ================================== # Part 2: Build & Send Request (outsourced to maap-py lib) # ================================== - maap = MAAP() + maap = MAAP(maap_api(self.request.host)) try: r = maap.dismissJob(params['job_id']) # print(r.status_code) @@ -784,7 +782,7 @@ def get(self): # ================================== # Part 2: Build & Send Request (outsourced to maap-py lib) # ================================== - maap = MAAP() + maap = MAAP(maap_api(self.request.host)) try: r = maap.deleteJob(params['job_id']) # print(r.status_code) @@ -853,7 +851,7 @@ def get(self): # logging.debug(list(params.values())) # logging.debug(complete) - maap = MAAP() + maap = MAAP(maap_api(self.request.host)) # return all algorithms if malformed request if complete: r = maap.describeAlgorithm('{algo_id}:{version}'.format(**params)) @@ -959,7 +957,7 @@ def get(self): # ================================== # Part 2: Build & Send Request (outsourced to maap-py lib) # ================================== - maap = MAAP() + maap = MAAP(maap_api(self.request.host)) # return all algorithms if malformed request if complete: r = 
maap.describeAlgorithm('{algo_id}:{version}'.format(**params)) @@ -1084,7 +1082,7 @@ def get(self): vals['run_command'] = params['code_path'] vals['disk_space'] = "10GB" - vals['memory'] = "15GB" + vals['queue'] = "15GB" # default example algo inputs ins = '' @@ -1150,7 +1148,7 @@ def get(self): # ================================== # Part 2: Build & Send Request (outsourced to maap-py lib) # ================================== - maap = MAAP() + maap = MAAP(maap_api(self.request.host)) try: r = maap.listJobs(params['username']) # print(r.status_code) @@ -1227,7 +1225,7 @@ def get(self): class GetQueuesHandler(IPythonHandler): def get(self): - maap = MAAP() + maap = MAAP(maap_api(self.request.host)) r = maap.getQueues() try: resp = json.loads(r.text) diff --git a/submit_jobs/submit_jobs/register.json b/submit_jobs/submit_jobs/register.json index 9bce8d44..5cf63a70 100644 --- a/submit_jobs/submit_jobs/register.json +++ b/submit_jobs/submit_jobs/register.json @@ -7,9 +7,5 @@ "docker_container_url": "{docker_url}", "algorithm_params" : [ {algo_inputs} - {{ - "field": "timestamp", - "download": false - }} ] }} \ No newline at end of file diff --git a/submit_jobs/submit_jobs/register.yaml b/submit_jobs/submit_jobs/register.yaml index c4a47be1..448108a1 100644 --- a/submit_jobs/submit_jobs/register.yaml +++ b/submit_jobs/submit_jobs/register.yaml @@ -6,7 +6,7 @@ environment: {environment} repository_url: {repository_url} docker_url: {docker_url} # queue chosen when registering -memory: {memory} +queue: {queue} # fill out these fields # explain what this algorithm does diff --git a/submit_jobs/submit_jobs/register_url.json b/submit_jobs/submit_jobs/register_url.json index 39febe0f..994cf806 100644 --- a/submit_jobs/submit_jobs/register_url.json +++ b/submit_jobs/submit_jobs/register_url.json @@ -7,12 +7,8 @@ "docker_container_url": "{docker_url}", "repo_url" : "{repository_url}", "disk_space": "10GB", - "memory": "{memory}", + "queue": "{queue}", "algorithm_params" : [ {algo_inputs} - {{ - "field": "timestamp", - "download": false - }} ] }} diff --git a/user_guides/ipycmc.md b/user_guides/ipycmc.md index 1c7eeedb..343dba00 100644 --- a/user_guides/ipycmc.md +++ b/user_guides/ipycmc.md @@ -21,4 +21,117 @@ w.load_layer_config( "http://geoserver.biomass-maap.com/geoserver/gwc/service/wms?REQUEST=GetCapabilities", "wms/xml" ) -``` \ No newline at end of file + +``` +For loading 3D data, see documentation for [load_layer_config](#Class) below. + +## API Documentation + +### Module: `ipycmc` + +`.MapCMC()` - creates a new instance of a MapCMC widget. This widget is meant to be displayed inline in the notebook, and functions called on the instance will not perform as expected until the widget is displayed.
+ +Example +``` +w = ipycmc.MapCMC() +w +``` + +--- + +`.retrieve_data(plotType, startDate, endDate, ds, geometry)` - retrieves data meant for plotting from the MAAP data analysis platform + * `plotType`: _string_ - ['timeseries', 'timeAvgMap', 'hovmoller_lat', 'hovmoller_lon'] the type of plot data to be calculated + * `startDate`: _string_ - ISO 8601 string specifying the start time bound of the analysis + * `endDate`: _string_ - ISO 8601 string specifying the end time bound of the analysis + * `ds`: _list_ - list of dataset identifiers that should be included in this analysis + * `geometry`: _dict_ - dictionary of type, projection, and coordinates of the area that should be included in the analysis + +Example +``` +plotType = "timeseries" +startDate = "2019-08-25T07:00:00.000Z" +endDate = "2019-09-01T07:00:00.000Z" +ds = ["dataset_id"] +geometry = {"type":"Box","proj":"EPSG:4326","coordinateType":"Cartographic","coordinates":[-180,-90,180,90]} +data = ipycmc.retrieve_data(plotType, startDate, endDate, ds, geometry) +data +``` + +--- + +`.plot_data(plotType, data)` - plots the data from `retrieve_data()` inline in a charting widget + * `plotType`: _string_ - ['timeseries', 'timeAvgMap', 'hovmoller_lat', 'hovmoller_lon'] the type of plot to be generated + * `data`: _dict_ - the output from `retrieve_data()` for this plot type + +Example +``` +plotType = "timeseries" +startDate = "2019-08-25T07:00:00.000Z" +endDate = "2019-09-01T07:00:00.000Z" +ds = ["dataset_id"] +geometry = {"type":"Box","proj":"EPSG:4326","coordinateType":"Cartographic","coordinates":[-180,-90,180,90]} +data = ipycmc.retrieve_data(plotType, startDate, endDate, ds, geometry) +ipycmc.plot_data(plotType, data) +``` + +--- + +### Class: `MapCMC` + +NOTE: All methods of the `MapCMC` class expect the widget to be rendered, so make sure you display the widget before calling any of these methods, like so: + +``` +w = ipycmc.MapCMC() +w +``` + +--- + +`.load_layer_config(url, handle_as, default_ops={})` - load a layer config into the mapping widget + * `url`: _string_ - the url endpoint of the config (such as a WMS or WMTS GetCapabilities endpoint) + * `handle_as`: _string_ - ["json", "wmts/xml", "wms/xml"] the type of config endpoint to be loaded + * `default_ops`: _dict_ - a dictionary of default options to apply to the loaded layers. If loading from a GIBS endpoint, you might use: `{"handleAs": "GIBS_raster"}`. + +Example +``` +w.load_layer_config( + "https://gibs.earthdata.nasa.gov/wmts/epsg4326/best/wmts.cgi?SERVICE=WMTS&request=GetCapabilities", + "wmts/xml", + {"handleAs": "GIBS_raster"} +) +``` +When loading 3D layers from MAAP's CMR you must pass `"json"` as `handle_as` and `{"handleAs": "vector-3d-tile"}` as `default_ops`. + +Example +``` +w.load_layer_config( + "https://cmr.maap-project.org/search/concepts/G1200354094-NASA_MAAP.json", + "json", + {"handleAs": "vector-3d-tile"}) +``` +--- + +`.set_date(date_str, format_str="")` - set the display date of the mapping widget + * `date_str`: _string_ - date or datetime that the widget should be set to + * `format_str`: _string_ - the format of the date string.
See token formats for MomentJS: https://momentjs.com/docs/#/parsing/string-format/ + +Example +``` +w.set_date("2019-Jan-03", "YYYY-MMM-DD") +``` + +--- + +`.get_date()` - get the current widget display date as an ISO 8601 string + +--- + +`.get_layers()` - get the list of ingested layer information dictionaries + +--- + +`.get_area_selections()` - get the list of selected area information dictionaries + +--- + +`.get_active_layers()` - get the list of layer information dictionaries for layers currently active on the map diff --git a/user_guides/notebook_magics.md b/user_guides/notebook_magics.md index 9372ead3..e4b21c93 100644 --- a/user_guides/notebook_magics.md +++ b/user_guides/notebook_magics.md @@ -88,11 +88,6 @@ Input Identifier: pass_number DataType: string -Input - Title: timestamp - Identifier: timestamp - DataType: string - Input Title: username Identifier: username
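A minimal usage sketch for the `MapCMC` getters documented in `user_guides/ipycmc.md` above (`get_date`, `get_layers`, `get_area_selections`, `get_active_layers`). The values shown in the comments are illustrative assumptions only; the widget must already be displayed and interacted with (layers loaded, an area selected, a date set) before these calls return populated results.

```
import ipycmc

w = ipycmc.MapCMC()
w  # display the widget first, then interact with the map

# query the widget state from a later cell
current_date = w.get_date()            # ISO 8601 string, e.g. "2019-01-03T00:00:00Z" (example value)
layers = w.get_layers()                # list of dicts describing every ingested layer
selections = w.get_area_selections()   # list of dicts describing the selected areas
active = w.get_active_layers()         # list of dicts for layers currently active on the map
```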