The Apache Airflow standard package is built on the Bitnami airflow-scheduler image. The current version is 2.4.58.

You can install and deploy it directly with the 轻云UC deployment tool, or follow the manual steps in this document. The project is fully open source, and the configuration files can be obtained from qingcloud-platform: a one-stop, out-of-the-box, extensible, component-based software factory; an efficient, easy-to-use, low-code, component-based software design tool that helps small and micro enterprises achieve digital transformation quickly and at low cost, and improves developer productivity.

qinghub automated installation and deployment configuration repository

What is Apache Airflow Scheduler?

Apache Airflow is a tool for expressing and executing workflows as directed acyclic graphs (DAGs). The Airflow scheduler triggers tasks and provides tools to monitor task progress.

Quick start

docker run --name airflow-scheduler bitnami/airflow-scheduler:latest

You can find the default credentials and the available configuration options in the Environment variables section.

Prerequisites

To run this application you need Docker Engine >= 1.10.0. Docker Compose 1.6.0 or later is recommended.
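To quickly confirm the versions available on your host before proceeding, you can run the checks below (a minimal sketch; whether you use the standalone docker-compose binary or the Docker Compose plugin depends on your installation):

# Print the Docker Engine server version (should be >= 1.10.0)
docker version --format '{{.Server.Version}}'

# Print the Docker Compose version (should be >= 1.6.0); use whichever form your installation provides
docker-compose --version
docker compose version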

Usage

Apache Airflow Scheduler is intended to be run with the CeleryExecutor. Therefore, you will need the rest of the Airflow components for this image to work: an Airflow web server, one or more Airflow workers, a PostgreSQL database, and a Redis(R) server.

Using the Docker command line

  1. Create a Docker network
docker network create airflow-tier
  2. Create a volume for PostgreSQL persistence and create the PostgreSQL container
docker volume create --name postgresql_data
docker run -d --name postgresql \
  -e POSTGRESQL_USERNAME=bn_airflow \
  -e POSTGRESQL_PASSWORD=bitnami1 \
  -e POSTGRESQL_DATABASE=bitnami_airflow \
  --net airflow-tier \
  --volume postgresql_data:/bitnami/postgresql \
  bitnami/postgresql:latest
  3. Create a volume for Redis(R) persistence and create the Redis(R) container
docker volume create --name redis_data
docker run -d --name redis \
  -e ALLOW_EMPTY_PASSWORD=yes \
  --net airflow-tier \
  --volume redis_data:/bitnami \
  bitnami/redis:latest
  4. Launch the Apache Airflow web server container
docker run -d --name airflow -p 8080:8080 \
  -e AIRFLOW_FERNET_KEY=46BKJoQYlPPOexq0OhDZnIlNepKFf87WFwLbfzqDDho= \
  -e AIRFLOW_SECRET_KEY=a25mQ1FHTUh3MnFRSk5KMEIyVVU2YmN0VGRyYTVXY08= \
  -e AIRFLOW_EXECUTOR=CeleryExecutor \
  -e AIRFLOW_DATABASE_NAME=bitnami_airflow \
  -e AIRFLOW_DATABASE_USERNAME=bn_airflow \
  -e AIRFLOW_DATABASE_PASSWORD=bitnami1 \
  -e AIRFLOW_LOAD_EXAMPLES=yes \
  -e AIRFLOW_PASSWORD=bitnami123 \
  -e AIRFLOW_USERNAME=user \
  -e AIRFLOW_EMAIL=user@example.com \
  --net airflow-tier \
  bitnami/airflow:latest
  5. Launch the Apache Airflow Scheduler container
docker run -d --name airflow-scheduler \
  -e AIRFLOW_FERNET_KEY=46BKJoQYlPPOexq0OhDZnIlNepKFf87WFwLbfzqDDho= \
  -e AIRFLOW_SECRET_KEY=a25mQ1FHTUh3MnFRSk5KMEIyVVU2YmN0VGRyYTVXY08= \
  -e AIRFLOW_EXECUTOR=CeleryExecutor \
  -e AIRFLOW_DATABASE_NAME=bitnami_airflow \
  -e AIRFLOW_DATABASE_USERNAME=bn_airflow \
  -e AIRFLOW_DATABASE_PASSWORD=bitnami1 \
  -e AIRFLOW_LOAD_EXAMPLES=yes \
  --net airflow-tier \
  bitnami/airflow-scheduler:latest
  6. Launch the Apache Airflow worker container
docker run -d --name airflow-worker \
  -e AIRFLOW_FERNET_KEY=46BKJoQYlPPOexq0OhDZnIlNepKFf87WFwLbfzqDDho= \
  -e AIRFLOW_SECRET_KEY=a25mQ1FHTUh3MnFRSk5KMEIyVVU2YmN0VGRyYTVXY08= \
  -e AIRFLOW_EXECUTOR=CeleryExecutor \
  -e AIRFLOW_DATABASE_NAME=bitnami_airflow \
  -e AIRFLOW_DATABASE_USERNAME=bn_airflow \
  -e AIRFLOW_DATABASE_PASSWORD=bitnami1 \
  --net airflow-tier \
  bitnami/airflow-worker:latest

Access: http://your-ip:8080
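As a quick sanity check once the containers are up, you can confirm that the scheduler started cleanly and that the web server responds on its health endpoint (a minimal sketch; the container names match the docker run commands above, and /health is the Airflow web server's standard health endpoint):

# List the Airflow-related containers started above
docker ps --filter name=airflow

# Follow the scheduler logs and confirm there are no startup errors
docker logs -f airflow-scheduler

# Query the web server health endpoint (replace your-ip with your host address)
curl http://your-ip:8080/health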

Persisting your application

The Airflow container relies on the PostgreSQL database and Redis(R) to persist its data. This means the Airflow container itself does not persist anything. To avoid data loss, you should mount volumes to persist the PostgreSQL data and the Redis(R) data.

The examples above define the Docker volumes postgresql_data and redis_data. The Airflow application state will persist as long as these volumes are not removed.

To avoid inadvertent removal of these volumes, you can mount host directories as data volumes. Alternatively, you can make use of volume plugins to host the volume data.

Mount host directories as data volumes with Docker Compose

The following docker-compose.yml template demonstrates how to use host directories as data volumes.

version: '2'
services:
  postgresql:
    image: 'bitnami/postgresql:latest'
    environment:
      - POSTGRESQL_DATABASE=bitnami_airflow
      - POSTGRESQL_USERNAME=bn_airflow
      - POSTGRESQL_PASSWORD=bitnami1
    volumes:
      - /path/to/postgresql-persistence:/bitnami
  redis:
    image: 'bitnami/redis:latest'
    environment:
      - ALLOW_EMPTY_PASSWORD=yes
    volumes:
      - /path/to/redis-persistence:/bitnami
  airflow-worker:
    image: bitnami/airflow-worker:latest
    environment:
      - AIRFLOW_FERNET_KEY=46BKJoQYlPPOexq0OhDZnIlNepKFf87WFwLbfzqDDho=
      - AIRFLOW_SECRET_KEY=a25mQ1FHTUh3MnFRSk5KMEIyVVU2YmN0VGRyYTVXY08=
      - AIRFLOW_EXECUTOR=CeleryExecutor
      - AIRFLOW_DATABASE_NAME=bitnami_airflow
      - AIRFLOW_DATABASE_USERNAME=bn_airflow
      - AIRFLOW_DATABASE_PASSWORD=bitnami1
      - AIRFLOW_LOAD_EXAMPLES=yes
  airflow-scheduler:
    image: bitnami/airflow-scheduler:latest
    environment:
      - AIRFLOW_FERNET_KEY=46BKJoQYlPPOexq0OhDZnIlNepKFf87WFwLbfzqDDho=
      - AIRFLOW_SECRET_KEY=a25mQ1FHTUh3MnFRSk5KMEIyVVU2YmN0VGRyYTVXY08=
      - AIRFLOW_EXECUTOR=CeleryExecutor
      - AIRFLOW_DATABASE_NAME=bitnami_airflow
      - AIRFLOW_DATABASE_USERNAME=bn_airflow
      - AIRFLOW_DATABASE_PASSWORD=bitnami1
      - AIRFLOW_LOAD_EXAMPLES=yes
  airflow:
    image: bitnami/airflow:latest
    environment:
      - AIRFLOW_FERNET_KEY=46BKJoQYlPPOexq0OhDZnIlNepKFf87WFwLbfzqDDho=
      - AIRFLOW_SECRET_KEY=a25mQ1FHTUh3MnFRSk5KMEIyVVU2YmN0VGRyYTVXY08=
      - AIRFLOW_EXECUTOR=CeleryExecutor
      - AIRFLOW_DATABASE_NAME=bitnami_airflow
      - AIRFLOW_DATABASE_USERNAME=bn_airflow
      - AIRFLOW_DATABASE_PASSWORD=bitnami1
      - AIRFLOW_PASSWORD=bitnami123
      - AIRFLOW_USERNAME=user
      - AIRFLOW_EMAIL=user@example.com
    ports:
      - '8080:8080'
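To bring this stack up, save the template as docker-compose.yml and start the services (a minimal sketch; the file name and working directory are assumptions):

# Start all services defined in docker-compose.yml in the background
docker-compose up -d

# Check the state of the services
docker-compose ps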
Mount host directories as data volumes using the Docker command line
  1. Create a network (if it does not already exist)
docker network create airflow-tier
  2. Create the PostgreSQL container with host volumes
docker run -d --name postgresql \
  -e POSTGRESQL_USERNAME=bn_airflow \
  -e POSTGRESQL_PASSWORD=bitnami1 \
  -e POSTGRESQL_DATABASE=bitnami_airflow \
  --net airflow-tier \
  --volume /path/to/postgresql-persistence:/bitnami \
  bitnami/postgresql:latest
  3. Create the Redis(R) container with host volumes
docker run -d --name redis \
  -e ALLOW_EMPTY_PASSWORD=yes \
  --net airflow-tier \
  --volume /path/to/redis-persistence:/bitnami \
  bitnami/redis:latest
  4. Create the Airflow container
docker run -d --name airflow -p 8080:8080 \
  -e AIRFLOW_FERNET_KEY=46BKJoQYlPPOexq0OhDZnIlNepKFf87WFwLbfzqDDho= \
  -e AIRFLOW_SECRET_KEY=a25mQ1FHTUh3MnFRSk5KMEIyVVU2YmN0VGRyYTVXY08= \
  -e AIRFLOW_EXECUTOR=CeleryExecutor \
  -e AIRFLOW_DATABASE_NAME=bitnami_airflow \
  -e AIRFLOW_DATABASE_USERNAME=bn_airflow \
  -e AIRFLOW_DATABASE_PASSWORD=bitnami1 \
  -e AIRFLOW_LOAD_EXAMPLES=yes \
  -e AIRFLOW_PASSWORD=bitnami123 \
  -e AIRFLOW_USERNAME=user \
  -e AIRFLOW_EMAIL=user@example.com \
  --net airflow-tier \
  bitnami/airflow:latest
  5. Create the Apache Airflow Scheduler container
docker run -d --name airflow-scheduler \
  -e AIRFLOW_FERNET_KEY=46BKJoQYlPPOexq0OhDZnIlNepKFf87WFwLbfzqDDho= \
  -e AIRFLOW_SECRET_KEY=a25mQ1FHTUh3MnFRSk5KMEIyVVU2YmN0VGRyYTVXY08= \
  -e AIRFLOW_EXECUTOR=CeleryExecutor \
  -e AIRFLOW_DATABASE_NAME=bitnami_airflow \
  -e AIRFLOW_DATABASE_USERNAME=bn_airflow \
  -e AIRFLOW_DATABASE_PASSWORD=bitnami1 \
  -e AIRFLOW_LOAD_EXAMPLES=yes \
  --net airflow-tier \
  bitnami/airflow-scheduler:latest
  6. Create the Airflow worker container
docker run -d --name airflow-worker \
  -e AIRFLOW_FERNET_KEY=46BKJoQYlPPOexq0OhDZnIlNepKFf87WFwLbfzqDDho= \
  -e AIRFLOW_SECRET_KEY=a25mQ1FHTUh3MnFRSk5KMEIyVVU2YmN0VGRyYTVXY08= \
  -e AIRFLOW_EXECUTOR=CeleryExecutor \
  -e AIRFLOW_DATABASE_NAME=bitnami_airflow \
  -e AIRFLOW_DATABASE_USERNAME=bn_airflow \
  -e AIRFLOW_DATABASE_PASSWORD=bitnami1 \
  --net airflow-tier \
  bitnami/airflow-worker:latest

Configuration

Installing additional Python modules

This container supports installing additional Python modules at start-up. To do so, you can mount a requirements.txt file with your specific needs at the path /bitnami/python/requirements.txt.
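For example, the scheduler container could be started with a requirements.txt from the host mounted at that path (a minimal sketch; /path/to/requirements.txt is a placeholder for a file on your host, and the remaining variables reuse the scheduler example above):

# Mount a host requirements.txt so the listed packages are installed at start-up
docker run -d --name airflow-scheduler \
  -e AIRFLOW_FERNET_KEY=46BKJoQYlPPOexq0OhDZnIlNepKFf87WFwLbfzqDDho= \
  -e AIRFLOW_SECRET_KEY=a25mQ1FHTUh3MnFRSk5KMEIyVVU2YmN0VGRyYTVXY08= \
  -e AIRFLOW_EXECUTOR=CeleryExecutor \
  -e AIRFLOW_DATABASE_NAME=bitnami_airflow \
  -e AIRFLOW_DATABASE_USERNAME=bn_airflow \
  -e AIRFLOW_DATABASE_PASSWORD=bitnami1 \
  --volume /path/to/requirements.txt:/bitnami/python/requirements.txt \
  --net airflow-tier \
  bitnami/airflow-scheduler:latest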

Environment variables

Customizable environment variables

| Name | Description | Default Value |
|------|-------------|---------------|
| AIRFLOW_EXECUTOR | Airflow executor. | SequentialExecutor |
| AIRFLOW_EXECUTOR | Airflow executor. | CeleryExecutor |
| AIRFLOW_FORCE_OVERWRITE_CONF_FILE | Force the airflow.cfg config file generation. | no |
| AIRFLOW_WEBSERVER_HOST | Airflow webserver host. | 127.0.0.1 |
| AIRFLOW_WEBSERVER_PORT_NUMBER | Airflow webserver port. | 8080 |
| AIRFLOW_LOAD_EXAMPLES | To load example tasks into the application. | yes |
| AIRFLOW_HOSTNAME_CALLABLE | Method to obtain the hostname. | socket.gethostname |
| AIRFLOW_DATABASE_HOST | Hostname for PostgreSQL server. | postgresql |
| AIRFLOW_DATABASE_HOST | Hostname for PostgreSQL server. | 127.0.0.1 |
| AIRFLOW_DATABASE_PORT_NUMBER | Port used by PostgreSQL server. | 5432 |
| AIRFLOW_DATABASE_NAME | Database name that Airflow will use to connect with the database. | bitnami_airflow |
| AIRFLOW_DATABASE_USERNAME | Database user that Airflow will use to connect with the database. | bn_airflow |
| AIRFLOW_DATABASE_USE_SSL | Set to yes if the database is using SSL. | no |
| AIRFLOW_REDIS_USE_SSL | Set to yes if Redis(R) uses SSL. | no |
| REDIS_HOST | Hostname for Redis(R) server. | redis |
| REDIS_HOST | Hostname for Redis(R) server. | 127.0.0.1 |
| REDIS_PORT_NUMBER | Port used by Redis(R) server. | 6379 |
| REDIS_DATABASE | Name of the Redis(R) database. | 1 |

Read-only environment variables

| Name | Description | Value |
|------|-------------|-------|
| AIRFLOW_BASE_DIR | Airflow installation directory. | ${BITNAMI_ROOT_DIR}/airflow |
| AIRFLOW_HOME | Airflow home directory. | ${AIRFLOW_BASE_DIR} |
| AIRFLOW_BIN_DIR | Airflow directory for binary executables. | ${AIRFLOW_BASE_DIR}/venv/bin |
| AIRFLOW_LOGS_DIR | Airflow logs directory. | ${AIRFLOW_BASE_DIR}/logs |
| AIRFLOW_SCHEDULER_LOGS_DIR | Airflow scheduler logs directory. | ${AIRFLOW_LOGS_DIR}/scheduler |
| AIRFLOW_LOG_FILE | Airflow logs file. | ${AIRFLOW_LOGS_DIR}/airflow-scheduler.log |
| AIRFLOW_CONF_FILE | Airflow configuration file. | ${AIRFLOW_BASE_DIR}/airflow.cfg |
| AIRFLOW_TMP_DIR | Airflow directory for temporary files. | ${AIRFLOW_BASE_DIR}/tmp |
| AIRFLOW_PID_FILE | Path to the Airflow PID file. | ${AIRFLOW_TMP_DIR}/airflow-scheduler.pid |
| AIRFLOW_DAGS_DIR | Airflow data to be persisted. | ${AIRFLOW_BASE_DIR}/dags |
| AIRFLOW_DAEMON_USER | Airflow system user. | airflow |
| AIRFLOW_DAEMON_GROUP | Airflow system group. | airflow |

In addition to the previous environment variables, all the parameters from the configuration file can be overridden by using environment variables with the following format: AIRFLOW__{SECTION}__{KEY}. Note the double underscores.
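For instance, the load_examples option in the [core] section of airflow.cfg could be overridden when starting a container (a minimal sketch for illustration; the variable name simply follows the AIRFLOW__{SECTION}__{KEY} convention, and the remaining values reuse the scheduler example above):

# Override [core] load_examples in airflow.cfg via an environment variable
docker run -d --name airflow-scheduler \
  -e AIRFLOW_FERNET_KEY=46BKJoQYlPPOexq0OhDZnIlNepKFf87WFwLbfzqDDho= \
  -e AIRFLOW_SECRET_KEY=a25mQ1FHTUh3MnFRSk5KMEIyVVU2YmN0VGRyYTVXY08= \
  -e AIRFLOW_EXECUTOR=CeleryExecutor \
  -e AIRFLOW_DATABASE_NAME=bitnami_airflow \
  -e AIRFLOW_DATABASE_USERNAME=bn_airflow \
  -e AIRFLOW_DATABASE_PASSWORD=bitnami1 \
  -e AIRFLOW__CORE__LOAD_EXAMPLES=False \
  --net airflow-tier \
  bitnami/airflow-scheduler:latest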

Specifying environment variables using Docker Compose
version: '2'

services:
  airflow:
    image: bitnami/airflow:1
    environment:
      - AIRFLOW_FERNET_KEY=46BKJoQYlPPOexq0OhDZnIlNepKFf87WFwLbfzqDDho=
      - AIRFLOW_SECRET_KEY=a25mQ1FHTUh3MnFRSk5KMEIyVVU2YmN0VGRyYTVXY08=
      - AIRFLOW_EXECUTOR=CeleryExecutor
      - AIRFLOW_DATABASE_NAME=bitnami_airflow
      - AIRFLOW_DATABASE_USERNAME=bn_airflow
      - AIRFLOW_DATABASE_PASSWORD=bitnami1
      - AIRFLOW_PASSWORD=bitnami123
      - AIRFLOW_USERNAME=user
      - AIRFLOW_EMAIL=user@example.com
Specifying environment variables on the Docker command line
docker run -d --name airflow -p 8080:8080 \
    -e AIRFLOW_FERNET_KEY=46BKJoQYlPPOexq0OhDZnIlNepKFf87WFwLbfzqDDho= \
    -e AIRFLOW_SECRET_KEY=a25mQ1FHTUh3MnFRSk5KMEIyVVU2YmN0VGRyYTVXY08= \
    -e AIRFLOW_EXECUTOR=CeleryExecutor \
    -e AIRFLOW_DATABASE_NAME=bitnami_airflow \
    -e AIRFLOW_DATABASE_USERNAME=bn_airflow \
    -e AIRFLOW_DATABASE_PASSWORD=bitnami1 \
    -e AIRFLOW_PASSWORD=bitnami123 \
    -e AIRFLOW_USERNAME=user \
    -e AIRFLOW_EMAIL=user@example.com \
    --volume airflow_data:/bitnami \
    bitnami/airflow:latest

