First, you need to install the base Kedro package in a supported version.
Kedro 0.17 is supported by kedro-airflow-k8s, but not by kedro-mlflow yet, so the latest version from the 0.16 family is recommended:
$ pip install 'kedro<0.17'
Install from PyPI
You can install the kedro-airflow-k8s plugin from PyPI with pip:
$ pip install --upgrade kedro-airflow-k8s
Install from sources
You may want to install the develop branch, which has unreleased features:
$ pip install git+https://github.com/getindata/kedro-airflow-k8s.git@develop
You can check the available commands by going into the project directory and running:
$ kedro airflow-k8s
Usage: kedro airflow-k8s [OPTIONS] COMMAND [ARGS]...

Options:
  -e, --env TEXT       Environment to use.
  -p, --pipeline TEXT  Pipeline name to pick.
  -h, --help           Show this message and exit.

Commands:
  compile          Create an Airflow DAG for a project
  init             Initializes configuration for the plugin
  list-pipelines   List pipelines generated by this plugin
  run-once         Uploads pipeline to Airflow and runs once
  schedule         Uploads pipeline to Airflow with given schedule
  ui               Open Apache Airflow UI in new browser tab
  upload-pipeline  Uploads pipeline to Airflow DAG location
The compile command takes one argument, which is the directory name containing configuration (relative to the conf folder). As an outcome, the dag directory contains a Python file with the generated DAG.
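For example, assuming base is the name of the configuration directory (a common Kedro default, used here purely for illustration) and that it is passed positionally as described above:

$ kedro airflow-k8s compile base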
The init command adds the default plugin configuration to the project, based on Apache Airflow CLI input. It can also optionally add GitHub Actions workflows to streamline project build and upload.
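A minimal invocation, assuming the command collects any Apache Airflow details it needs interactively (no additional arguments are documented above):

$ kedro airflow-k8s init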
The list-pipelines command lists all pipelines generated by this plugin that exist on the Airflow server. All generated DAGs are tagged with generated_with_kedro_airflow_k8s:$PLUGIN_VERSION, and the prefix of this tag is used to distinguish them from other tags.
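Run from the project directory, it queries the Airflow server for DAGs carrying that tag prefix:

$ kedro airflow-k8s list-pipelines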
The run-once command generates a DAG from the pipeline, uploads it to the Airflow DAG location, and triggers the DAG run as soon as the new DAG instance is available. It optionally allows waiting for DAG run completion, checking whether the run finished with success status.
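For example, assuming the target environment is selected with the group-level -e option shown in the help output above (the environment name airflow is illustrative only):

$ kedro airflow-k8s -e airflow run-once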
The schedule command takes three arguments: the directory name containing configuration (relative to the conf folder), the output location of the generated DAG, and a cron-like expression that defines when the DAG runs.
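A sketch, assuming the three arguments are passed positionally in the order described (base, dags, and the cron expression are illustrative values only):

$ kedro airflow-k8s schedule base dags '0 2 * * *'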
The ui command simplifies access to the Apache Airflow console. It also allows opening the UI for a specific DAG.
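The simplest invocation opens the main console in a new browser tab; presumably the -p option from the help output above selects the DAG-specific view, though that pairing is an assumption here:

$ kedro airflow-k8s ui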
The upload-pipeline command takes two arguments: the directory name containing configuration (relative to the conf folder) and the output location of the generated DAG.
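A sketch, mirroring the schedule example above and assuming positional arguments in the order described (base and dags are illustrative values only):

$ kedro airflow-k8s upload-pipeline base dags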