---
title: "Apache Airflow 2.8.0 is here"
linkTitle: "Apache Airflow 2.8.0 is here"
author: "Ephraim Anierobi"
github: "ephraimbuddy"
linkedin: "ephraimanierobi"
description: "Introducing Apache Airflow 2.8.0: Enhanced with New Features and Significant Improvements"
tags: [Release]
date: "2023-12-14"
---


**Details**:

📦 PyPI: https://pypi.org/project/apache-airflow/2.8.0/ \
📚 Docs: https://airflow.apache.org/docs/apache-airflow/2.8.0/ \
🛠 Release Notes: https://airflow.apache.org/docs/apache-airflow/2.8.0/release_notes.html \
🐳 Docker Image: `docker pull apache/airflow:2.8.0` \
🚏 Constraints: https://github.com/apache/airflow/tree/constraints-2.8.0

## Airflow Object Storage (AIP-58)

*This feature is experimental and subject to change.*

Airflow now offers a generic abstraction layer over various object stores like S3, GCS, and Azure Blob Storage, enabling the use of different storage systems in DAGs without code modification.

In addition, it allows you to use most of the standard Python modules that work with file-like objects, such as `shutil`.

Here is an example of how to use the new feature to open a file:

```python
from airflow.decorators import task
from airflow.io.path import ObjectStoragePath

# conn_id is optional; it identifies the Airflow connection used to access the store
base = ObjectStoragePath("s3://my-bucket/", conn_id="aws_default")


@task
def read_file(path: ObjectStoragePath) -> str:
    with path.open() as f:
        return f.read()
```
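
Because `ObjectStoragePath` opens file-like objects, standard-library helpers such as `shutil` can work with them directly. Here is a minimal sketch (the bucket, keys, and connection id are placeholders):

```python
import shutil

from airflow.io.path import ObjectStoragePath

src = ObjectStoragePath("s3://my-bucket/data/input.csv", conn_id="aws_default")
dst = ObjectStoragePath("s3://my-bucket/archive/input.csv", conn_id="aws_default")

# Both paths open as file-like objects, so shutil can stream between them
with src.open("rb") as fsrc, dst.open("wb") as fdst:
    shutil.copyfileobj(fsrc, fdst)
```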

These examples are just the tip of the iceberg. The new feature also allows you to configure an alternative backend for a scheme or protocol.

Here is an example of how to configure a custom backend for the `dbfs` scheme:

```python
from airflow.io.path import ObjectStoragePath
from airflow.io.store import attach

from fsspec.implementations.dbfs import DBFSFileSystem

# Register a custom fsspec filesystem implementation for the "dbfs" scheme
attach(protocol="dbfs", fs=DBFSFileSystem(instance="myinstance", token="mytoken"))

# Paths using the "dbfs://" scheme now resolve through the attached backend
base = ObjectStoragePath("dbfs://my-location/")
```

For more information: [Airflow Object Storage](https://airflow.apache.org/docs/apache-airflow/stable/core-concepts/objectstorage.html)

Support for a specific object storage system depends on the installed providers; the `file` scheme is supported out of the box.

## Ship logs from other components to Task logs
This feature seamlessly integrates task-related messages from various Airflow components, including the Scheduler and
Executors, into the task logs. This integration allows users to easily track error messages and other relevant
information within a single log view.

Previously, if a task was terminated by the scheduler before starting, timed out after queuing for too long, or turned into a zombie, nothing was recorded in the task log. With this enhancement, an error message can now be dispatched to the task log in these situations, making it easy to spot in the UI.

This feature can be toggled; for more information, look for `enable_task_context_logger` in the [logging configuration documentation](https://airflow.apache.org/docs/apache-airflow/stable/configurations-ref.html#logging).

## Listener hooks for Datasets
This feature enables users to subscribe to Dataset creation and update events using listener hooks.
It’s particularly useful to trigger external processes based on a Dataset being created or updated.
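
As a rough sketch (assuming Airflow's pluggy-based listener API and the new `on_dataset_created` / `on_dataset_changed` hook specs added in 2.8), a listener might look like this:

```python
# dataset_listener.py -- a minimal sketch of a dataset event listener
from airflow.datasets import Dataset
from airflow.listeners import hookimpl


@hookimpl
def on_dataset_created(dataset: Dataset):
    # Called when a new Dataset is registered, e.g. to notify an external catalog
    print(f"Dataset created: {dataset.uri}")


@hookimpl
def on_dataset_changed(dataset: Dataset):
    # Called when a Dataset is updated by a producing task
    print(f"Dataset updated: {dataset.uri}")
```

The listener module is then typically registered through an Airflow plugin's `listeners` attribute so the scheduler and workers pick it up.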

## Using Extra Index URLs with PythonVirtualenvOperator and Caching
This feature allows you to specify extra index URLs for `PythonVirtualenvOperator` (and the corresponding `@task.virtualenv` decorator), making it possible to build virtualenvs from additional (private) Python package repositories.

You can also reuse virtualenvs by caching them in a specified directory so that subsequent runs pick them up. This is achieved by setting `venv_cache_path` to a file system folder on your worker.
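
Roughly, the two options look like this (a minimal sketch; assumes the `index_urls` parameter introduced for the operator in 2.8, with placeholder URLs and paths):

```python
from airflow.operators.python import PythonVirtualenvOperator


def print_funcsigs_version():
    import funcsigs  # installed into the virtualenv from the configured indexes

    print(funcsigs.__version__)


# Inside a DAG definition
run_in_venv = PythonVirtualenvOperator(
    task_id="run_in_venv",
    python_callable=print_funcsigs_version,
    requirements=["funcsigs==1.0.2"],
    # Extra (private) package indexes used when building the virtualenv
    index_urls=["https://pypi.org/simple", "https://my-index.example.com/simple"],
    # Cache the virtualenv here so subsequent runs can reuse it
    venv_cache_path="/opt/airflow/venv-cache",
)
```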

For more information: [PythonVirtualenvOperator](https://airflow.apache.org/docs/apache-airflow/stable/howto/operator/python.html#pythonvirtualenvoperator)

Additional new features and improvements can be found in the [Airflow 2.8.0 release notes](https://airflow.apache.org/docs/apache-airflow/2.8.0/release_notes.html#airflow-2-8-0-2023-12-14).

# Contributors