Apache Airflow Provider

The apache-airflow-providers-akeyless package integrates the Akeyless identity security platform with Apache Airflow. It lets you fetch secrets, manage credentials, and use Akeyless as a native Airflow Secrets Backend.

The provider is maintained in the apache/airflow repository.

Before you begin

  • You have an Akeyless account with at least one Authentication Method configured.
  • If using api_key authentication, you have your Access ID and Access Key ready.
  • If using a cloud-based authentication method (AWS IAM, GCP, or Azure AD), install the cloud_id extras package (see Installation).
  • Apache Airflow 2.11.0 or later is installed.
Capabilities

| Capability | Class | Description |
| --- | --- | --- |
| Hook | airflow.providers.akeyless.hooks.akeyless.AkeylessHook | Interact with Akeyless directly from Directed Acyclic Graph (DAG) code — fetch static, dynamic, and rotated secrets; create, update, or delete items; list paths. |
| Connection type | akeyless | Airflow connection type identifier. Create a connection with this type in the Airflow UI or environment to supply credentials to the hook. |
| Secrets Backend | airflow.providers.akeyless.secrets.akeyless.AkeylessBackend | Transparently resolve Airflow Connections, Variables, and Config from Akeyless — no DAG code changes required. Supports api_key and uid authentication only. |

Requirements

| Requirement | Minimum version |
| --- | --- |
| Python | 3.10 |
| apache-airflow | 2.11.0 |
| akeyless | 5.0.0 |

Installation

Install the base package:

pip install apache-airflow-providers-akeyless

For cloud-based authentication (AWS IAM, GCP, Azure AD) also install the cloud ID extras:

pip install apache-airflow-providers-akeyless[cloud_id]

Authentication Methods

The provider supports the following Akeyless Authentication Methods:

| access_type | Required fields | Supported by |
| --- | --- | --- |
| api_key (default) | access_id, access_key | Hook, Secrets Backend |
| aws_iam | access_id + cloud_id extras package | Hook only |
| gcp | access_id + cloud_id extras package; optional: gcp_audience | Hook only |
| azure_ad | access_id + cloud_id extras package; optional: azure_object_id | Hook only |
| uid | uid_token | Hook, Secrets Backend |
| jwt | access_id, jwt | Hook only |
| k8s | access_id, k8s_auth_config_name | Hook only |
| certificate | access_id, certificate_data, private_key_data | Hook only |
⚠️

Unsupported authentication methods: The following Akeyless authentication methods are not supported by this provider: OCI IAM, Kerberos, LDAP, SAML, OIDC, and Email.

Usage

Airflow Connection (Hook)

Create an Airflow Connection with Connection Type = akeyless.

Via the Airflow UI

In the Airflow UI connection form, the following fields are available:

| UI field | Value |
| --- | --- |
| API URL | https://api.akeyless.io (or your Gateway URL) |
| Access ID | Your Akeyless Access ID |
| Access Key | Your Akeyless Access Key (for api_key authentication; leave blank for other types) |
| Access type | One of the access_type values from Authentication Methods (default: api_key) |

The form also shows dedicated fields for each authentication-method-specific parameter: UID Token, JWT, K8s Auth Config Name, Certificate Data (PEM), Private Key Data (PEM), GCP Audience, and Azure Object ID. The raw Extra, Schema, and Port fields are hidden.

Via environment variable or CLI

When defining connections outside the UI (for example, with AIRFLOW_CONN_* environment variables), provide the extra field as a JSON object:

| access_type | extra JSON |
| --- | --- |
| api_key (default) | {"access_type": "api_key"} |
| uid | {"access_type": "uid", "uid_token": "<UID token>"} — login and password are unused |
| jwt | {"access_type": "jwt", "jwt": "<JWT>"} |
| k8s | {"access_type": "k8s", "k8s_auth_config_name": "<config name>"} |
| aws_iam | {"access_type": "aws_iam"} — cloud identity resolved automatically |
| gcp | {"access_type": "gcp"} or {"access_type": "gcp", "gcp_audience": "<audience>"} |
| azure_ad | {"access_type": "azure_ad"} or {"access_type": "azure_ad", "azure_object_id": "<object ID>"} |
| certificate | {"access_type": "certificate", "certificate_data": "<PEM>", "private_key_data": "<PEM>"} |
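As a concrete sketch, an api_key connection can be supplied entirely through an environment variable using Airflow's JSON connection format. The field mapping below (host carrying the API URL, login the Access ID, password the Access Key) mirrors the UI fields above but is an assumption about this provider, not confirmed by it:

```shell
# Hypothetical example: define the "akeyless_default" connection as JSON.
# host/login/password are assumed to map to API URL / Access ID / Access Key.
export AIRFLOW_CONN_AKEYLESS_DEFAULT='{
  "conn_type": "akeyless",
  "host": "https://api.akeyless.io",
  "login": "<Access ID>",
  "password": "<Access Key>",
  "extra": {"access_type": "api_key"}
}'
```

Because the value is ordinary JSON, you can sanity-check it with any JSON parser before starting Airflow.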

Then use the hook in a DAG:

from airflow.providers.akeyless.hooks.akeyless import AkeylessHook

hook = AkeylessHook(akeyless_conn_id="akeyless_default")

Fetching secrets

# Static secret
value = hook.get_secret_value("/my/secret")

# Multiple static secrets at once
values = hook.get_secret_values(["/secret/a", "/secret/b"])

# Dynamic secret (for example, a database credentials producer)
creds = hook.get_dynamic_secret_value("/dynamic/db-producer")
username, password = creds["username"], creds["password"]

# Rotated secret
rotated = hook.get_rotated_secret_value("/rotated/db-creds")

Managing secrets

# Create a static secret
hook.create_secret("/new/secret", "my-value", description="Created by Airflow")

# Update a static secret's value
hook.update_secret_value("/new/secret", "updated-value")

# List items under a path
items = hook.list_items("/path/prefix")

# Describe an item (returns metadata)
meta = hook.describe_item("/my/secret")

Secrets Backend

Configure Airflow to fetch Connections, Variables, and Config directly from Akeyless.

Add to airflow.cfg:

[secrets]
backend = airflow.providers.akeyless.secrets.akeyless.AkeylessBackend
backend_kwargs = {
    "connections_path": "/airflow/connections",
    "variables_path": "/airflow/variables",
    "config_path": "/airflow/config",
    "api_url": "https://api.akeyless.io",
    "access_id": "<Access ID>",
    "access_key": "<Access Key>",
    "access_type": "api_key"
    }
ℹ️

In airflow.cfg, multi-line backend_kwargs values must have each continuation line indented with at least one space. Alternatively, provide the value as a single-line JSON string.
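The indentation rule can be verified with Python's standard configparser, the parser behind airflow.cfg: indented continuation lines are folded into a single multi-line value, which then parses as JSON. A minimal sketch, not Airflow's actual loading code:

```python
import configparser
import json

# A multi-line backend_kwargs value: every continuation line is indented,
# so configparser folds the whole block into one value.
cfg_text = """
[secrets]
backend = airflow.providers.akeyless.secrets.akeyless.AkeylessBackend
backend_kwargs = {
    "connections_path": "/airflow/connections",
    "api_url": "https://api.akeyless.io"
    }
"""

parser = configparser.ConfigParser()
parser.read_string(cfg_text)

# The folded value is valid JSON.
kwargs = json.loads(parser["secrets"]["backend_kwargs"])
print(kwargs["connections_path"])  # → /airflow/connections
```

If a continuation line loses its leading whitespace, configparser raises a parsing error before JSON decoding is even attempted — which is why the indentation matters.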

For uid authentication, omit access_key and include uid_token in backend_kwargs instead:

[secrets]
backend = airflow.providers.akeyless.secrets.akeyless.AkeylessBackend
backend_kwargs = {
    "connections_path": "/airflow/connections",
    "variables_path": "/airflow/variables",
    "config_path": "/airflow/config",
    "api_url": "https://api.akeyless.io",
    "access_id": "<Access ID>",
    "access_type": "uid",
    "uid_token": "<UID token>"
    }

Or with environment variables:

export AIRFLOW__SECRETS__BACKEND="airflow.providers.akeyless.secrets.akeyless.AkeylessBackend"
export AIRFLOW__SECRETS__BACKEND_KWARGS='{"connections_path": "/airflow/connections", "variables_path": "/airflow/variables", "config_path": "/airflow/config", "api_url": "https://api.akeyless.io", "access_id": "<Access ID>", "access_key": "<Access Key>", "access_type": "api_key"}'

Naming Convention

Secrets are looked up by joining <base_path>/<key>:

| Type | Example lookup path |
| --- | --- |
| Connection postgres_default | /airflow/connections/postgres_default |
| Variable my_var | /airflow/variables/my_var |
| Config smtp_host | /airflow/config/smtp_host |
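The lookup rule is a plain path join, roughly equivalent to this sketch (build_path is a hypothetical helper for illustration, not part of the provider's API):

```python
def build_path(base_path: str, key: str) -> str:
    """Join a configured base path and a key, as in <base_path>/<key>."""
    return f"{base_path.rstrip('/')}/{key}"

# A Connection with conn_id = postgres_default and
# connections_path = /airflow/connections resolves to:
print(build_path("/airflow/connections", "postgres_default"))
# → /airflow/connections/postgres_default
```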

Storing Connections in Akeyless

Store the connection secret value in one of these formats:

URI format:

postgresql://user:password@host:5432/dbname

JSON format:

{
  "conn_type": "postgres",
  "host": "db.example.com",
  "login": "admin",
  "password": "secret",
  "schema": "mydb",
  "port": 5432
}

JSON with conn_uri:

{
  "conn_uri": "postgresql://user:password@host:5432/dbname"
}
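When using the URI format, special characters in the username or password must be percent-encoded, or the URI will not parse correctly. A small sketch using Python's standard urllib:

```python
from urllib.parse import quote

user = "admin"
password = "p@ss/word!"  # contains characters not allowed raw in a URI

# safe="" also encodes "/", so it cannot be mistaken for a path separator.
uri = (
    f"postgresql://{quote(user, safe='')}:{quote(password, safe='')}"
    f"@db.example.com:5432/mydb"
)
print(uri)
# → postgresql://admin:p%40ss%2Fword%21@db.example.com:5432/mydb
```

Store the encoded URI as the secret value; Airflow decodes the credentials when it builds the Connection.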

Cloud-Based Authentication

For AWS IAM, GCP, or Azure AD, omit access_key and set the appropriate access_type. The provider uses the workload's cloud identity automatically.

⚠️

Secrets Backend limitation: AkeylessBackend only supports api_key and uid authentication. For cloud-based authentication (AWS IAM, GCP, Azure AD) use AkeylessHook directly in your DAGs.

Example using AWS IAM with the hook:

from airflow.providers.akeyless.hooks.akeyless import AkeylessHook

hook = AkeylessHook(akeyless_conn_id="akeyless_aws_iam")
value = hook.get_secret_value("/my/secret")

Set the connection access_type extra field to aws_iam and install the cloud_id extras. The hook authenticates using the workload's AWS IAM identity (EC2 instance profile, ECS task role, and so on) — no static credentials required.
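Putting that together, a hypothetical environment-variable definition for such a connection might look like the following (the mapping of login to the Access ID is an assumption; note the absence of any password field):

```shell
# Hypothetical aws_iam connection: no static credentials, only the
# access_type and the (assumed) Access ID in "login".
export AIRFLOW_CONN_AKEYLESS_AWS_IAM='{
  "conn_type": "akeyless",
  "host": "https://api.akeyless.io",
  "login": "<Access ID>",
  "extra": {"access_type": "aws_iam"}
}'
```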

Troubleshooting

ImportError: akeyless_cloud_id is required

You are using aws_iam, gcp, or azure_ad authentication without the cloud ID extras package. Install it:

pip install apache-airflow-providers-akeyless[cloud_id]

ValueError: Unsupported access_type for AkeylessBackend

AkeylessBackend only supports api_key and uid. For cloud-based authentication in the Secrets Backend, use AkeylessHook directly in your DAGs instead.

Secret not found when using Secrets Backend

Verify that the secret path in Akeyless matches the expected naming convention: <base_path>/<key>. For example, a Connection with conn_id = postgres_default is looked up at <connections_path>/postgres_default. Confirm the path and value are present in Akeyless, then restart Airflow for the configuration to take effect.

Authentication fails with 401 Unauthorized

  • For api_key: confirm the Access ID and Access Key fields in the connection are correct.
  • For uid: confirm the uid_token value is valid and not expired.
  • For cloud-based authentication methods: confirm the workload has the expected IAM role or service account attached.
