If you already describe your warehouse in dbt, you already have half of an AnomalyArmor contract. This guide walks through converting a dbt project’s schema.yml files into ODCS YAML and applying it with armor contract apply.
[Diagram: dbt schema.yml translated to ODCS YAML and applied to AnomalyArmor]
Install the CLI first (see Data Contracts for the auth step):
pip install anomalyarmor-cli
armor auth login --key aa_live_your_key_here

One-line pipeline

armor migrate-from dbt ./my-dbt-project/ | armor contract apply --asset <asset-uuid> -f -
The adapter writes ODCS YAML to stdout, and contract apply reads from stdin when passed -f -. If the adapter maps zero models, it refuses to emit and exits non-zero, so the pipeline fails cleanly instead of clobbering your live config with an empty contract. For anything non-trivial, prefer translating to a file and previewing the diff before you apply:
# 1. Translate
armor migrate-from dbt ./my-dbt-project/ -o contracts/from-dbt.yaml

# 2. Preview
armor contract plan --asset <asset-uuid> -f contracts/from-dbt.yaml

# 3. Apply once the diff looks right
armor contract apply --asset <asset-uuid> -f contracts/from-dbt.yaml

What the adapter reads

Running armor migrate-from dbt <project> walks the project root recursively and parses every schema.yml / schema.yaml file. It skips:
  • dbt_packages/ and dbt_modules/ (vendored third-party packages, not your models).
  • node_modules/ (front-end tooling that sometimes coexists with dbt repos).
  • Any hidden directory (a path segment starting with .).
Your project root is the directory holding dbt_project.yml. The adapter does not call dbt compile or dbt parse, so you do not need a live target or a working profiles.yml. Static file scan only.
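The scan rules above are simple enough to sketch. This is an illustration of the documented behavior, not the adapter's actual source: it collects every schema.yml / schema.yaml under the project root while pruning the skipped directories.

```python
import os

# Directories the adapter documentation says are skipped.
SKIP_DIRS = {"dbt_packages", "dbt_modules", "node_modules"}
SCHEMA_NAMES = {"schema.yml", "schema.yaml"}

def find_schema_files(project_root):
    """Recursively collect schema.yml/schema.yaml paths, honoring the skip rules."""
    found = []
    for dirpath, dirnames, filenames in os.walk(project_root):
        # Prune vendored packages, front-end tooling, and hidden
        # directories in place so os.walk never descends into them.
        dirnames[:] = [
            d for d in dirnames
            if d not in SKIP_DIRS and not d.startswith(".")
        ]
        for name in filenames:
            if name in SCHEMA_NAMES:
                found.append(os.path.join(dirpath, name))
    return sorted(found)
```

Because the scan is purely static, it works on a fresh clone with no profiles.yml and no warehouse credentials.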

Mapping table

| dbt input | ODCS output | Notes |
| --- | --- | --- |
| models[].name | schema[].name | Table identity. |
| models[].columns[].name | schema[].properties[].name | Column identity. |
| models[].columns[].data_type | schema[].properties[].physicalType | Only emitted if present in the dbt file. |
| models[].columns[].description | schema[].properties[].description | Native ODCS. |
| models[].description | schema[].description | Native ODCS. |
| columns[].tests[].not_null | validity rule, rule_type=not_null | AA extension. |
| columns[].tests[].unique | validity rule, rule_type=unique | AA extension. |
| columns[].tests[].accepted_values.values | validity rule, rule_type=allowed_values | Needs a non-empty values list. |
| models[].config.freshness | freshness schedule (hours) | Best-effort: error_after preferred, falls back to warn_after. minute / hour / day periods convert to hours. |
Everything the adapter understands lands in the emitted YAML. Everything else lands in the warnings section of the summary line so you know what to review.
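To make the mapping concrete, here is an illustrative input/output pair. The table names and columns are invented, and only the native ODCS schema fields are shown; treat a real emitted file as authoritative for the exact layout.

A dbt file like this:

```yaml
# models/schema.yml (dbt input)
models:
  - name: orders
    description: One row per order.
    columns:
      - name: order_id
        data_type: bigint
        description: Surrogate key.
        tests:
          - not_null
          - unique
```

maps to roughly:

```yaml
# ODCS output (abbreviated; AA validity-rule extensions omitted)
schema:
  - name: orders
    description: One row per order.
    properties:
      - name: order_id
        physicalType: bigint
        description: Surrogate key.
```

The not_null and unique tests become AA validity rules (rule_type=not_null, rule_type=unique) in the extension section of the emitted document; check a real emitted file for their exact shape.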

What gets skipped (and why)

| Input shape | Reason | Resolution |
| --- | --- | --- |
| dbt_utils.* tests (e.g. dbt_utils.accepted_range) | No 1:1 map to an AA validity rule | Re-author as a custom SQL check in AnomalyArmor, or add to the request queue for adapter coverage. |
| dbt_expectations.* tests | Too many variants to map safely | Same as above. |
| relationships tests | Reference integrity is not modeled by AA validity rules | Track as a custom SQL check. |
| Any test with a namespaced name (contains .) | Conservative skip | The adapter errs on the side of a warning over a silent mistranslation. |
| accepted_values with empty or missing values | Would emit an empty rule | Fix the test in dbt, re-run. |
| freshness with a non-minute/hour/day period | Cannot normalize to hours | Normalize in dbt or edit the emitted YAML directly. |
| dbt meta, tags, owner | No round-trip design yet | Not a data-quality signal; track with AA tags or leave in dbt. |
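The freshness rows in the two tables above can be summarized in one best-effort conversion: prefer error_after, fall back to warn_after, and normalize minute / hour / day counts to hours. This sketch is an illustration of that documented behavior, not the adapter's source; a None return stands in for "skip with a warning":

```python
# Minutes per supported dbt freshness period; anything else is skipped.
_PERIOD_TO_MINUTES = {"minute": 1, "hour": 60, "day": 1440}

def freshness_to_hours(freshness):
    """Best-effort conversion of a dbt freshness config to hours.

    Prefers error_after, falls back to warn_after; returns None when
    the period cannot be normalized (caller records a warning instead).
    """
    rule = freshness.get("error_after") or freshness.get("warn_after")
    if not rule:
        return None
    minutes_per_unit = _PERIOD_TO_MINUTES.get(rule.get("period"))
    if minutes_per_unit is None:
        return None  # e.g. period: week -> cannot normalize to hours
    return rule["count"] * minutes_per_unit / 60
```

So `{error_after: {count: 2, period: day}}` becomes a 48-hour schedule, while a `week` period falls through to a warning.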
The CLI summary prints a one-liner that groups these warnings so you see totals at a glance:
dbt -> ODCS (./my-dbt-project/): mapped 42, warnings 7 (dbt_utils.accepted_range: 3, ...).
If mapped_count is zero, the adapter exits non-zero and writes nothing to stdout.
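A summary line in that shape is easy to build by grouping warning reasons with a counter. This is a hypothetical reconstruction of the format shown above, not the CLI's actual code:

```python
from collections import Counter

def summary_line(project, mapped_count, warnings):
    """Render a one-line summary grouping warning reasons by count."""
    parts = [f"dbt -> ODCS ({project}): mapped {mapped_count}"]
    if warnings:
        grouped = ", ".join(
            f"{reason}: {n}" for reason, n in Counter(warnings).most_common()
        )
        parts.append(f"warnings {len(warnings)} ({grouped})")
    return ", ".join(parts) + "."
```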

Flags

| Flag | Purpose |
| --- | --- |
| --output, -o | Write YAML to a file instead of stdout. Pass - or omit for stdout. |
| --name | Override the contract name field. Defaults to the project directory name. |

Validate before applying

armor migrate-from dbt ./my-dbt-project/ -o from-dbt.yaml
armor contract validate -f from-dbt.yaml
contract validate runs the document against the ODCS v3.1.0 JSON Schema. Parse or schema errors print with file path, YAML path, and line number so editors and CI can surface them inline. No DB connection required beyond auth.

One dbt project, many assets

Today contract apply is asset-scoped (one asset per call). If your dbt project covers multiple warehouse tables:
  1. Translate once: armor migrate-from dbt ./proj -o full.yaml.
  2. Split the file by table, or re-export per-asset from AnomalyArmor to get the correct asset scoping, then apply each file against its asset.
Multi-asset bulk apply runs through the REST jobs endpoint today (see the bulk-apply section of the contracts guide). A CLI shortcut is on the roadmap.

See also