Airflow Code Editor Plugin

This plugin for Apache Airflow allows you to edit DAGs directly within your browser, providing a seamless and efficient workflow for managing your pipelines. Offering a user-friendly file management interface within designated directories, it facilitates effortless editing, uploading, and downloading of files. With Git support enabled, DAGs are stored in a Git repository, enabling users to explore Git history, review local modifications, and commit changes.

System Requirements

  • Airflow Versions
    • 1.10.3 or newer
  • git Versions (git is not required if git support is disabled)
    • 2.0 or newer

Screenshots

  • File Manager
  • Editor
  • Search
  • Git History
  • Git Workspace

Install Instructions

Docker Images

For ease of deployment, use the production-ready reference container image, which is based on the reference images for Apache Airflow.

The following images are available:

  • andreax79/airflow-code-editor:latest - the latest released Airflow Code Editor image with the latest Apache Airflow version
  • andreax79/airflow-code-editor:2.10.0 - the latest released Airflow Code Editor with a specific Apache Airflow version
  • andreax79/airflow-code-editor:2.10.0-7.7.0 - a specific version of both Apache Airflow and Airflow Code Editor
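
If you just want to try the image locally, a minimal sketch is shown below. It assumes the image behaves like the reference Apache Airflow images (the container command is passed to the airflow CLI) and that the standalone command is available in your Airflow version; adapt it to your own deployment.

    # Run Airflow in standalone mode (webserver, scheduler and a local metadata DB)
    # and expose the UI on http://localhost:8080
    docker run -it -p 8080:8080 andreax79/airflow-code-editor:2.10.0 standalone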

Installing from PyPI

  1. Install the plugin

     pip install airflow-code-editor

  2. Install optional dependencies

     • black - Black Python code formatter
     • isort - A Python utility/library to sort imports
     • fs-s3fs - S3FS Amazon S3 Filesystem
     • fs-gcsfs - Google Cloud Storage Filesystem
     • ... other filesystems supported by PyFilesystem - see https://www.pyfilesystem.org/page/index-of-filesystems/

     pip install black isort fs-s3fs fs-gcsfs

  3. Restart the Airflow Web Server

  4. Open Admin - DAGs Code Editor
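
To confirm that Airflow picked up the plugin after the restart, you can list the loaded plugins with the Airflow 2.x CLI (a quick check; the exact output format depends on your Airflow version):

    airflow plugins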

Config Options

You can set options by editing Airflow's configuration file or by setting environment variables. Edit your airflow.cfg and add any of the following settings to the [code_editor] section. All settings are optional.

  • enabled - enable this plugin (default: True)
  • git_enabled - enable git support (default: True). If git is not installed, disable this option.
  • git_cmd - git command (path)
  • git_default_args - git arguments added to each call (default: -c color.ui=true)
  • git_author_name - human-readable name used as the author/committer (default: the logged-in user's first and last names)
  • git_author_email - email for the author/committer (default: the logged-in user's email)
  • git_init_repo - initialize a git repo in the DAGs folder (default: True)
  • root_directory - root folder (default: the Airflow DAGs folder)
  • line_length - Python code formatter - max line length (default: 88)
  • string_normalization - Python code formatter - if true, normalize string quotes and prefixes (default: False)
  • mount, mount1, ... - configure additional folders (mount points) - format: name=xxx,path=yyy
  • ignored_entries - comma-separated list of entries to be excluded from the file/directory list (default: .*,__pycache__)
   [code_editor]
   enabled = True
   git_enabled = True
   git_cmd = /usr/bin/git
   git_default_args = -c color.ui=true
   git_init_repo = False
   root_directory = /home/airflow/dags
   line_length = 88
   string_normalization = False
   mount = name=data,path=/home/airflow/data
   mount1 = name=logs,path=/home/airflow/logs
   mount2 = name=data,path=s3://example

Mount Options:

Mount point paths can be local directories or any filesystem supported by PyFilesystem (for example S3, GCS, or FTP). Example:

  • name=ftp_server,path=ftp://user:password@ftp.example.com/private
  • name=data,path=s3://example
  • name=tmp,path=/tmp

You can also set options with the following environment variables:

  • AIRFLOW__CODE_EDITOR__ENABLED
  • AIRFLOW__CODE_EDITOR__GIT_ENABLED
  • AIRFLOW__CODE_EDITOR__GIT_CMD
  • AIRFLOW__CODE_EDITOR__GIT_DEFAULT_ARGS
  • AIRFLOW__CODE_EDITOR__GIT_AUTHOR_NAME
  • AIRFLOW__CODE_EDITOR__GIT_AUTHOR_EMAIL
  • AIRFLOW__CODE_EDITOR__GIT_INIT_REPO
  • AIRFLOW__CODE_EDITOR__ROOT_DIRECTORY
  • AIRFLOW__CODE_EDITOR__LINE_LENGTH
  • AIRFLOW__CODE_EDITOR__STRING_NORMALIZATION
  • AIRFLOW__CODE_EDITOR__MOUNT, AIRFLOW__CODE_EDITOR__MOUNT1, AIRFLOW__CODE_EDITOR__MOUNT2, ...
  • AIRFLOW__CODE_EDITOR__IGNORED_ENTRIES

Example:

   export AIRFLOW__CODE_EDITOR__STRING_NORMALIZATION=True
   export AIRFLOW__CODE_EDITOR__MOUNT='name=data,path=/home/airflow/data'
   export AIRFLOW__CODE_EDITOR__MOUNT1='name=logs,path=/home/airflow/logs'
   export AIRFLOW__CODE_EDITOR__MOUNT2='name=tmp,path=/tmp'
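
When running the Docker image described above, the same variables can be passed to the container. The sketch below assumes a plain docker run invocation; the standalone command and port mapping follow the reference Airflow images and are not specific to this plugin:

    docker run -p 8080:8080 \
      -e AIRFLOW__CODE_EDITOR__GIT_ENABLED=True \
      -e AIRFLOW__CODE_EDITOR__MOUNT='name=data,path=/home/airflow/data' \
      andreax79/airflow-code-editor:latest standalone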

REST API

Airflow Code Editor provides a REST API. Through this API, users can interact with the application programmatically, enabling automation, data retrieval, and integration with other software.

For detailed information on how to use each endpoint, refer to the API documentation.

REST API Authentication

The API authentication is inherited from Apache Airflow.

If you want to check which auth backend is currently set, you can use the airflow config get-value api auth_backends command, as in the example below.

$ airflow config get-value api auth_backends
airflow.api.auth.backend.basic_auth

For details on configuring the authentication, see API Authorization.
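
For example, with the basic_auth backend shown above, a client can authenticate with a username and password. The endpoint path below is a placeholder, not an actual route; see the API documentation for the real endpoints:

    # Placeholder URL: replace /code_editor/... with a real endpoint from the API documentation
    curl --user "username:password" http://localhost:8080/code_editor/...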

Development Instructions

  1. Fork the repo

  2. Clone it on your local machine

     git clone https://github.com/andreax79/airflow-code-editor.git
     cd airflow-code-editor

  3. Create the dev image

     make dev-image

  4. Switch the Node.js version

     nvm use

  5. Make the changes you need. Build the npm package with:

     make npm-build

  6. Start the Airflow webserver with:

     make webserver

  7. Run the tests

     make test

  8. Commit and push your changes

  9. Create a pull request to the original repo

Links