Skills from astronomer/agents
17 skills available
airflow
This skill manages Apache Airflow operations via the `af` CLI and `uvx` wrapper. It runs shell commands (e.g., `uvx --from astro-airflow-mcp af`, `af instance discover --scan`), requires credentials (`AIRFLOW_API_TOKEN`, `AIRFLOW_AUTH_TOKEN`), and makes network calls to `https://airflow.example.com`.
256 installs
authoring-dags
Workflow and best practices for writing Apache Airflow DAGs. Use when the user wants to create a new DAG, write pipeline code, or asks about DAG patterns and conventions. For testing and debugging DAGs, see the testing-dags skill.
231 installs
tracing-upstream-lineage
Trace upstream data lineage. Use when the user asks where data comes from, what feeds a table, upstream dependencies, data sources, or needs to understand data origins.
227 installs
debugging-dags
Comprehensive DAG failure diagnosis and root cause analysis. Use for complex debugging requests requiring deep investigation like "diagnose and fix the pipeline", "full root cause analysis", "why is this failing and how to prevent it". For simple debugging ("why did dag fail", "show logs"), the airflow entrypoint skill handles it directly. This skill provides structured investigation and prevention recommendations.
225 installs
analyzing-data
High-risk skill: executes arbitrary Python and shell commands in a Jupyter kernel (`kc.execute`, `subprocess.run`) and can install packages (`uv pip install`). It loads configs/credentials from `~/.astro/agents/warehouse.yml` and `.env` and injects secrets into the kernel via env vars like `GOOGLE_APPLICATION_CREDENTIALS` and `SF_KEY`.
225 installs
testing-dags
Complex DAG testing workflows with debugging and fixing cycles. Use for multi-step testing requests like "test this dag and fix it if it fails", "test and debug", "run the pipeline and troubleshoot issues". For simple test requests ("test dag", "run dag"), the airflow entrypoint skill handles it directly. This skill is for iterative test-debug-fix cycles.
221 installs
tracing-downstream-lineage
Trace downstream data lineage and impact analysis. Use when the user asks what depends on this data, what breaks if something changes, downstream dependencies, or needs to assess change risk before modifying a table or DAG.
221 installs
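The two lineage-tracing skills above walk the dependency graph in opposite directions. A minimal sketch of that idea, with an invented table-level edge map (the table names, edges, and helper functions are illustrative, not the skills' actual implementation):

```python
from collections import deque

# Hypothetical lineage edges: producer -> consumers (all names are made up).
EDGES = {
    "raw_orders": ["stg_orders"],
    "stg_orders": ["fct_orders", "orders_report"],
    "fct_orders": ["finance_dashboard"],
}

def downstream(node, edges):
    """Breadth-first walk of everything that depends on `node` (impact set)."""
    seen, queue = set(), deque([node])
    while queue:
        current = queue.popleft()
        for child in edges.get(current, []):
            if child not in seen:
                seen.add(child)
                queue.append(child)
    return seen

def upstream(node, edges):
    """Invert the edges; the same walk then yields the sources feeding `node`."""
    inverted = {}
    for parent, children in edges.items():
        for child in children:
            inverted.setdefault(child, []).append(parent)
    return downstream(node, inverted)

print(sorted(downstream("stg_orders", EDGES)))  # impact set
print(sorted(upstream("fct_orders", EDGES)))    # origin set
```

Upstream tracing is downstream tracing over the inverted graph, which is why the two skills mirror each other.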
migrating-airflow-2-to-3
This skill guides migrating Airflow 2.x DAGs and code to Airflow 3.x. It instructs running `ruff check --preview --select AIR --fix --unsafe-fixes .` and using `DEPLOYMENT_API_TOKEN` / `AIRFLOW__API__BASE_URL` to call `https://<your-org>.astronomer.run/<deployment>/` via the REST API.
217 installs
checking-freshness
Quick data freshness check. Use when the user asks if data is up to date, when a table was last updated, if data is stale, or needs to verify that data is current before using it.
210 installs
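As an illustration of what a freshness check boils down to, a minimal sketch; the SLA window, timestamps, and function name are invented, and a real check would read last-updated metadata from the warehouse:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical SLA: a table counts as fresh if updated within the last 24 hours.
FRESHNESS_SLA = timedelta(hours=24)

def is_fresh(last_updated, now=None):
    """Compare a table's last-updated timestamp against the SLA window."""
    now = now or datetime.now(timezone.utc)
    return (now - last_updated) <= FRESHNESS_SLA

now = datetime(2024, 1, 2, 12, tzinfo=timezone.utc)
print(is_fresh(datetime(2024, 1, 2, 3, tzinfo=timezone.utc), now))    # updated 9h ago
print(is_fresh(datetime(2023, 12, 30, tzinfo=timezone.utc), now))     # stale
```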
profiling-tables
Deep-dive data profiling for a specific table. Use when the user asks to profile a table, wants statistics about a dataset, asks about data quality, or needs to understand a table's structure and content. Requires a table name.
206 installs
managing-astro-local-env
Manage local Airflow environment with Astro CLI. Use when the user wants to start, stop, or restart Airflow, view logs, troubleshoot containers, or fix environment issues. For project setup, see setting-up-astro-project.
205 installs
setting-up-astro-project
This skill initializes and configures Astro/Airflow projects with commands like `astro dev init` and configuration files such as `airflow_settings.yaml`. It directs running CLI steps (`astro dev object export --connections --file connections.yaml`), pulls packages from `https://pypi.example.com/simple`, and exposes connection credentials (`password` values) in exported files.
201 installs
annotating-task-lineage
Annotate Airflow tasks with data lineage using inlets and outlets. Use when the user wants to add lineage metadata to tasks, specify input/output datasets, or enable lineage tracking for operators without built-in OpenLineage extraction.
162 installs
creating-openlineage-extractors
Create custom OpenLineage extractors for Airflow operators. Use when the user needs lineage from unsupported or third-party operators, wants column-level lineage, or needs complex extraction logic beyond what inlets/outlets provide.
156 installs
cosmos-dbt-core
This skill converts a dbt Core project into an Airflow DAG/TaskGroup using Astronomer Cosmos and supplies configuration examples and code snippets. It instructs running dbt commands (`dbt build`, `dbt deps`), using env vars like `AIRFLOW__COSMOS__REMOTE_TARGET_PATH`, and interacting with cloud storage (`s3://bucket/target_dir/`).
155 installs
airflow-hitl
Use when the user needs human-in-the-loop workflows in Airflow (approval/reject, form input, or human-driven branching). Covers ApprovalOperator, HITLOperator, HITLBranchOperator, HITLEntryOperator. Requires Airflow 3.1+. Does not cover AI/LLM calls (see airflow-ai).
153 installs
cosmos-dbt-fusion
Dangerous skill: instructs execution of a remote installer via `curl -fsSL https://public.cdn.getdbt.com/fs/install/install.sh | sh` and other shell commands (`pip install`, `pip show`). It references local paths like `/home/astro/.local/bin/dbt` and environment variables such as `AIRFLOW__COSMOS__PRE_DBT_FUSION`.
149 installs