Apache Airflow Provider
The apache-airflow-providers-akeyless package integrates the Akeyless identity security platform with Apache Airflow. It lets you fetch secrets, manage credentials, and use Akeyless as a native Airflow Secrets Backend.
The provider is maintained in the apache/airflow repository.
Before you begin
- You have an Akeyless account with at least one Authentication Method configured.
- If using api_key authentication, you have your Access ID and Access Key ready.
- If using a cloud-based authentication method (AWS IAM, GCP, or Azure AD), install the cloud_id extras package (see Installation).
- Apache Airflow 2.11.0 or later is installed.
| Capability | Class | Description |
|---|---|---|
| Hook | airflow.providers.akeyless.hooks.akeyless.AkeylessHook | Interact with Akeyless directly from Directed Acyclic Graph (DAG) code — fetch static, dynamic, and rotated secrets; create, update, or delete items; list paths. |
| Connection type | akeyless | Airflow connection type identifier. Create a connection with this type in the Airflow UI or environment to supply credentials to the hook. |
| Secrets Backend | airflow.providers.akeyless.secrets.akeyless.AkeylessBackend | Transparently resolve Airflow Connections, Variables, and Config from Akeyless — no DAG code changes required. Supports api_key and uid authentication only. |
Requirements
| Requirement | Minimum version |
|---|---|
| Python | 3.10 |
| apache-airflow | 2.11.0 |
| akeyless | 5.0.0 |
Installation
Install the base package:
```shell
pip install apache-airflow-providers-akeyless
```

For cloud-based authentication (AWS IAM, GCP, Azure AD), also install the cloud ID extras:

```shell
pip install 'apache-airflow-providers-akeyless[cloud_id]'
```

Authentication Methods
The provider supports the following Akeyless Authentication Methods:
| access_type | Required fields | Supported by |
|---|---|---|
| api_key (default) | access_id, access_key | Hook, Secrets Backend |
| aws_iam | access_id + cloud_id extras package | Hook only |
| gcp | access_id + cloud_id extras package; optional: gcp_audience | Hook only |
| azure_ad | access_id + cloud_id extras package; optional: azure_object_id | Hook only |
| uid | uid_token | Hook, Secrets Backend |
| jwt | access_id, jwt | Hook only |
| k8s | access_id, k8s_auth_config_name | Hook only |
| certificate | access_id, certificate_data, private_key_data | Hook only |
Unsupported authentication methods: The following Akeyless authentication methods are not supported by this provider: OCI IAM, Kerberos, LDAP, SAML, OIDC, and Email.
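As a rough sketch, the required fields from the table above can be captured in a lookup dict. The validation helper below is illustrative only; it is not part of the provider:

```python
# Required connection fields per Akeyless access_type, taken from the table
# above. The missing_fields helper is illustrative, not provider code.
REQUIRED_FIELDS = {
    "api_key": {"access_id", "access_key"},
    "aws_iam": {"access_id"},
    "gcp": {"access_id"},
    "azure_ad": {"access_id"},
    "uid": {"uid_token"},
    "jwt": {"access_id", "jwt"},
    "k8s": {"access_id", "k8s_auth_config_name"},
    "certificate": {"access_id", "certificate_data", "private_key_data"},
}

def missing_fields(access_type: str, supplied: dict) -> set:
    """Return the required fields absent from a supplied connection config."""
    return REQUIRED_FIELDS[access_type] - set(supplied)

print(missing_fields("jwt", {"access_id": "p-1234"}))  # {'jwt'}
```

A pre-flight check like this can surface a misconfigured connection before the first Akeyless API call fails at runtime.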
Usage
Airflow Connection (Hook)
Create an Airflow Connection with Connection Type = akeyless.
Via the Airflow UI
In the Airflow UI connection form, the following fields are available:
| UI field | Value |
|---|---|
| API URL | https://api.akeyless.io (or your Gateway URL) |
| Access ID | Your Akeyless Access ID |
| Access Key | Your Akeyless Access Key (for api_key authentication; leave blank for other types) |
| Access type | One of the access_type values from Authentication Methods (default: api_key) |
The form also shows dedicated fields for each authentication-method-specific parameter: UID Token, JWT, K8s Auth Config Name, Certificate Data (PEM), Private Key Data (PEM), GCP Audience, and Azure Object ID. The raw Extra, Schema, and Port fields are hidden.
Via environment variable or CLI
When defining connections outside the UI (for example, with AIRFLOW_CONN_* environment variables), provide the extra field as a JSON object:
| access_type | extra JSON |
|---|---|
| api_key (default) | {"access_type": "api_key"} |
| uid | {"access_type": "uid", "uid_token": "<UID token>"} — login and password are unused |
| jwt | {"access_type": "jwt", "jwt": "<JWT>"} |
| k8s | {"access_type": "k8s", "k8s_auth_config_name": "<config name>"} |
| aws_iam | {"access_type": "aws_iam"} — cloud identity resolved automatically |
| gcp | {"access_type": "gcp"} or {"access_type": "gcp", "gcp_audience": "<audience>"} |
| azure_ad | {"access_type": "azure_ad"} or {"access_type": "azure_ad", "azure_object_id": "<object ID>"} |
| certificate | {"access_type": "certificate", "certificate_data": "<PEM>", "private_key_data": "<PEM>"} |
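When defining a connection as an environment variable, Airflow's JSON connection format (available since Airflow 2.3) can carry the extra object from the table. A sketch with placeholder values — the conn_id akeyless_default, host, and token are assumptions for illustration:

```python
import json

# Illustrative: compose the value for an AIRFLOW_CONN_* environment variable
# carrying the uid extra from the table above. All values are placeholders.
conn = {
    "conn_type": "akeyless",
    "host": "https://api.akeyless.io",  # API URL
    "extra": {"access_type": "uid", "uid_token": "<UID token>"},
}
print(f"AIRFLOW_CONN_AKEYLESS_DEFAULT='{json.dumps(conn)}'")
```

Generating the value with json.dumps avoids hand-escaping quotes inside the shell string.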
Then use the hook in a DAG:
```python
from airflow.providers.akeyless.hooks.akeyless import AkeylessHook

hook = AkeylessHook(akeyless_conn_id="akeyless_default")
```

Fetching secrets
```python
# Static secret
value = hook.get_secret_value("/my/secret")

# Multiple static secrets at once
values = hook.get_secret_values(["/secret/a", "/secret/b"])

# Dynamic secret (for example, a database credentials producer)
creds = hook.get_dynamic_secret_value("/dynamic/db-producer")
username, password = creds["username"], creds["password"]

# Rotated secret
rotated = hook.get_rotated_secret_value("/rotated/db-creds")
```

Managing secrets
```python
# Create a static secret
hook.create_secret("/new/secret", "my-value", description="Created by Airflow")

# Update a static secret's value
hook.update_secret_value("/new/secret", "updated-value")

# List items under a path
items = hook.list_items("/path/prefix")

# Describe an item (returns metadata)
meta = hook.describe_item("/my/secret")
```

Secrets Backend
Configure Airflow to fetch Connections, Variables, and Config directly from Akeyless.
Add to airflow.cfg:
```ini
[secrets]
backend = airflow.providers.akeyless.secrets.akeyless.AkeylessBackend
backend_kwargs = {
    "connections_path": "/airflow/connections",
    "variables_path": "/airflow/variables",
    "config_path": "/airflow/config",
    "api_url": "https://api.akeyless.io",
    "access_id": "<Access ID>",
    "access_key": "<Access Key>",
    "access_type": "api_key"
    }
```
In airflow.cfg, multi-line backend_kwargs values must have each continuation line indented with at least one space. Alternatively, provide the value as a single-line JSON string.
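For example, the same api_key configuration as a single-line value (a sketch; the Access ID and Access Key are placeholders):

```ini
[secrets]
backend = airflow.providers.akeyless.secrets.akeyless.AkeylessBackend
backend_kwargs = {"connections_path": "/airflow/connections", "variables_path": "/airflow/variables", "config_path": "/airflow/config", "api_url": "https://api.akeyless.io", "access_id": "<Access ID>", "access_key": "<Access Key>", "access_type": "api_key"}
```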
For uid authentication, omit access_key and include uid_token in backend_kwargs instead:
```ini
[secrets]
backend = airflow.providers.akeyless.secrets.akeyless.AkeylessBackend
backend_kwargs = {
    "connections_path": "/airflow/connections",
    "variables_path": "/airflow/variables",
    "config_path": "/airflow/config",
    "api_url": "https://api.akeyless.io",
    "access_id": "<Access ID>",
    "access_type": "uid",
    "uid_token": "<UID token>"
    }
```

Or with environment variables:
```shell
export AIRFLOW__SECRETS__BACKEND="airflow.providers.akeyless.secrets.akeyless.AkeylessBackend"
export AIRFLOW__SECRETS__BACKEND_KWARGS='{"connections_path": "/airflow/connections", "variables_path": "/airflow/variables", "config_path": "/airflow/config", "api_url": "https://api.akeyless.io", "access_id": "<Access ID>", "access_key": "<Access Key>", "access_type": "api_key"}'
```

Naming Convention
Secrets are looked up by joining <base_path>/<key>:
| Type | Example lookup path |
|---|---|
| Connection postgres_default | /airflow/connections/postgres_default |
| Variable my_var | /airflow/variables/my_var |
| Config smtp_host | /airflow/config/smtp_host |
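The lookup rule above amounts to a simple path join. An illustrative sketch (the lookup_path helper is not part of the provider):

```python
def lookup_path(base_path: str, key: str) -> str:
    """Join a configured base path and a key, per the <base_path>/<key> rule."""
    return f"{base_path.rstrip('/')}/{key}"

print(lookup_path("/airflow/connections", "postgres_default"))
# /airflow/connections/postgres_default
```

This is handy when checking that a secret stored in Akeyless sits at the exact path the backend will query.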
Storing Connections in Akeyless
Store the connection secret value in one of these formats:
URI format:
```
postgresql://user:password@host:5432/dbname
```

JSON format:
```json
{
  "conn_type": "postgres",
  "host": "db.example.com",
  "login": "admin",
  "password": "secret",
  "schema": "mydb",
  "port": 5432
}
```

JSON with conn_uri:

```json
{
  "conn_uri": "postgresql://user:password@host:5432/dbname"
}
```

Cloud-Based Authentication
For AWS IAM, GCP, or Azure AD, omit access_key and set the appropriate access_type. The provider uses the workload's cloud identity automatically.
Secrets Backend limitation: AkeylessBackend only supports api_key and uid authentication. For cloud-based authentication (AWS IAM, GCP, Azure AD), use AkeylessHook directly in your DAGs.
Example using AWS IAM with the hook:
```python
from airflow.providers.akeyless.hooks.akeyless import AkeylessHook

hook = AkeylessHook(akeyless_conn_id="akeyless_aws_iam")
value = hook.get_secret_value("/my/secret")
```

Set the connection's access_type extra field to aws_iam and install the cloud_id extras. The hook authenticates using the workload's AWS IAM identity (EC2 instance profile, ECS task role, and so on) — no static credentials required.
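A connection like this could be defined as an environment variable, for example. This is a sketch: the conn_id, URL, and the assumption that the Access ID maps to the connection's login field are illustrative, not confirmed by the provider docs:

```shell
# Illustrative: define an "akeyless_aws_iam" connection via an environment
# variable. No access key is needed; the workload's AWS IAM identity is used
# at runtime. Placing the Access ID in "login" is an assumption.
export AIRFLOW_CONN_AKEYLESS_AWS_IAM='{"conn_type": "akeyless", "host": "https://api.akeyless.io", "login": "<Access ID>", "extra": {"access_type": "aws_iam"}}'
```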
Troubleshooting
ImportError: akeyless_cloud_id is required
You are using aws_iam, gcp, or azure_ad authentication without the cloud ID extras package. Install it:

```shell
pip install 'apache-airflow-providers-akeyless[cloud_id]'
```

ValueError: Unsupported access_type for AkeylessBackend
AkeylessBackend only supports api_key and uid. For cloud-based authentication in the Secrets Backend, use AkeylessHook directly in your DAGs instead.
Secret not found when using Secrets Backend
Verify that the secret path in Akeyless matches the expected naming convention: <base_path>/<key>. For example, a Connection with conn_id = postgres_default is looked up at <connections_path>/postgres_default. Confirm the path and value are present in Akeyless, then restart Airflow for the configuration to take effect.
Authentication fails with 401 Unauthorized
- For api_key: confirm the Access ID and Access Key fields in the connection are correct.
- For uid: confirm the uid_token value is valid and not expired.
- For cloud-based authentication methods: confirm the workload has the expected IAM role or service account attached.
