diff --git a/README.md b/README.md
index b4d4d99e0..19e86f676 100644
--- a/README.md
+++ b/README.md
@@ -2,7 +2,7 @@
 Logo

-
+
 License Downloads
@@ -13,179 +13,53 @@
 From the [Elementary](https://www.elementary-data.com/) team, helping you deliver trusted data in the AI era. Ranked among the top 5 dbt packages and supported by a growing community of thousands.
-### Table of Contents
+> **Need data reliability at scale?** The Elementary dbt package is also the foundation for **[Elementary Cloud](https://docs.elementary-data.com/cloud/introduction)** — a full Data & AI Control Plane with automated ML monitoring, column-level lineage from ingestion to BI and AI assets, a built-in catalog, and AI agents that scale reliability workflows for engineers and business users. [Book a demo →](https://meetings-eu1.hubspot.com/joost-boonzajer-flaes/intro-call-sl-)
-
-- [**What's Inside the Elementary dbt Package?**](#whats-inside-the-elementary-dbt-package)
-- [**Get more out of Elementary dbt package**](#get-more-out-of-elementary-dbt-package)
-- [**Anomaly Detection Tests**](#anomaly-detection-tests)
-- [**Schema Tests**](#schema-tests)
-- [**Elementary Tables - Run Results and dbt Artifacts**](#elementary-tables---run-results-and-dbt-artifacts)
-- [**AI-powered data validation and unstructured data tests**](#ai-powered-data-validation-and-unstructured-data-tests)
-- [**Quickstart - dbt Package**](#quickstart---dbt-package)
-- [**Community & Support**](#community--support)
-- [**Contributions**](#contributions)
+---
-
-## **What's Inside the Elementary dbt Package?**
+## What it does
-
-The **Elementary dbt package** is designed to enhance data observability within your dbt workflows. It includes two core components:
+The package has two core components:
-
-- **Elementary Tests** – A collection of **anomaly detection tests** and other data quality checks that help identify unexpected trends, missing data, or schema changes directly within your dbt runs.
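For context on the tests this diff describes: Elementary tests are configured inline in a dbt project's `schema.yml`, like native dbt tests. A minimal sketch, assuming a model named `all_events` with a `loaded_at` load-time column (both illustrative, taken from the example removed by this diff):

```yml
models:
  - name: all_events
    config:
      elementary:
        # column holding each row's load time, used to bucket rows over time
        timestamp_column: 'loaded_at'
    tests:
      # alert when the daily row count deviates from the learned baseline
      - elementary.volume_anomalies:
          anomaly_sensitivity: 2
          time_bucket:
            period: day
            count: 1
```

Such a test runs with a plain `dbt test`, and its outcome is written to the `elementary_test_results` table alongside all other test results.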
-- **Metadata & Test Results Tables** – The package automatically generates and updates **metadata tables** in your data warehouse, capturing valuable information from your dbt runs and test results. These tables act as the backbone of your **observability setup**, enabling **alerts and reports** when connected to an Elementary observability platform.
+**1. Elementary Tables**
+Using dbt's on-run-end hook, the package automatically parses your dbt artifacts and run results and loads them as structured tables into your warehouse. This includes:
+- **Metadata tables** — models, tests, sources, exposures, columns, seeds, snapshots, and more
+- **Run results tables** — invocations, model run results, test results, source freshness, and job-level outcomes
-
-## Get more out of Elementary dbt package
+These tables are the backbone of any observability setup — enabling alerts, reports, and lineage when connected to Elementary OSS or Cloud. → [See full table reference](https://docs.elementary-data.com/data-tests/dbt/package-models)
-
-The **Elementary dbt package** helps you find anomalies in your data and build metadata tables from your dbt runs and tests—but there's even more you can do.
+**2. Elementary Tests**
+A suite of anomaly detection and data quality tests that run like native dbt tests — no separate tooling. Covers volume, freshness, column distributions, schema changes, and AI-powered validation for structured and unstructured data. → [See all tests](https://docs.elementary-data.com/data-tests/introduction)
-
-To generate observability reports, send alerts, and govern your data quality effectively, connect your dbt package to one of the following options:
+---
-
-- **Elementary OSS** - **A self-maintained, open-source CLI** that integrates seamlessly with your dbt project and the Elementary dbt package. It **enables alerting and provides the self-hosted Elementary data observability report**, offering a comprehensive view of your dbt runs, all dbt test results, data lineage, and test coverage. Quickstart [here](https://docs.elementary-data.com/oss/quickstart/quickstart-cli), and our team and community can provide great support on [Slack](https://www.elementary-data.com/community) if needed.
-- **Elementary Cloud** - A managed, AI-driven control plane for observability, quality, governance, and discovery. It includes automated ML monitoring, column-level lineage from source to BI, a built-in catalog, and AI agents that scale reliability workflows. Cloud supports both engineers and business users, enabling technical depth and simple self-service in one place. To learn more, [book a demo](https://cal.com/maayansa/elementary-intro-github-package) or [start a trial](https://www.elementary-data.com/signup).
+## Quickstart
-
-
-
-
-## Data Anomaly Detection & Schema changes as dbt Tests
-
-**Elementary tests are configured and executed like native tests in your project!**
-
-Elementary dbt tests help track and alert on schema changes as well as key metrics and metadata over time, including freshness, volume, distribution, cardinality, and more.
-
-**Seamlessly configured and run like native dbt tests,** Elementary tests detect anomalies and outliers, helping you catch data issues early.
-
-Example of an Elementary test config in `schema.yml`:
-
-```
-
-models:
-  - name: all_events
-    config:
-      elementary:
-        timestamp_column: 'loaded_at'
-    columns:
-      - name: event_count
-        tests:
-          - elementary.column_anomalies:
-              column_anomalies:
-                - average
-              where_expression: "event_type in ('event_1', 'event_2') and country_name != 'unwanted country'"
-              anomaly_sensitivity: 2
-              time_bucket:
-                period: day
-                count: 1
-
-```
-
-Elementary tests include:
-
-### **Anomaly Detection Tests**
-
-- **Volume anomalies -** Monitors the row count of your table over time per time bucket.
-- **Freshness anomalies -** Monitors the freshness of your table over time, as the expected time between data updates.
-- **Event freshness anomalies -** Monitors the freshness of event data over time, as the expected time it takes each event to load - that is, the time between when the event actually occurs (the **`event timestamp`**), and when it is loaded to the database (the **`update timestamp`**).
-- **Dimension anomalies -** Monitors the count of rows grouped by given **`dimensions`** (columns/expressions).
-- **Column anomalies -** Executes column level monitors on a certain column, with a chosen metric.
-- **All columns anomalies** - Executes column level monitors and anomaly detection on all the columns of the table.
-
-### **Schema Tests**
-
-- **Schema changes -** Alerts on a deleted table, deleted or added columns, or change of data type of a column.
-- **Schema changes from baseline** - Checks for schema changes against baseline columns defined in a source's or model's configuration.
-- **JSON schema** - Allows validating that a string column matches a given JSON schema.
-- **Exposure validation test -** Detects changes in your models' columns that break downstream exposure.
-
-Read more about the available [Elementary tests and configuration](https://docs.elementary-data.com/data-tests/introduction).
-
-## Elementary Tables - Run Results and dbt Artifacts
-
-The **Elementary dbt package** automatically stores **dbt artifacts and run results** in your data warehouse, creating structured tables that provide visibility into your dbt runs and metadata.
+→ [docs.elementary-data.com/data-tests/dbt/quickstart-package](https://docs.elementary-data.com/data-tests/dbt/quickstart-package)
-
-### **Metadata Tables - dbt Artifacts**
+---
-
-These tables provide a comprehensive view of your dbt project structure and configurations:
+## See it in action
-
-- **dbt_models** – Details on all dbt models.
-- **dbt_tests** – Stores information about dbt tests.
-- **dbt_sources** – Tracks source tables and freshness checks.
-- **dbt_exposures** – Logs downstream data usage.
-- **dbt_metrics** – Captures dbt-defined metrics.
-- **dbt_snapshots** – Stores historical snapshot data.
-- **dbt_seeds -** Stores current metadata about seed files in the dbt project.
-- **dbt_columns** - Stores detailed information about columns across the dbt project.
-
-### **Run Results Tables**
-
-These tables track execution details, test outcomes, and performance metrics from your dbt runs:
-
-- **dbt_run_results** – Captures high-level details of each dbt run.
-- **model_run_results** – Stores execution data for dbt models.
-- **snapshot_run_results** – Logs results from dbt snapshots.
-- **dbt_invocations** – Tracks each instance of dbt being run.
-- **elementary_test_results** – Consolidates all dbt test results, including Elementary anomaly tests.
-
-For a full breakdown of these tables, see the [documentation](https://docs.elementary-data.com/dbt/package-models).
-
-## AI-powered data validation and unstructured data tests
-
-Elementary leverages AI to enhance data reliability with natural language test definitions:
-
-- **AI data validation**: Define expectations in plain English to validate structured data
-- **Unstructured data validation**: Validate text, JSON, and other non-tabular data types
-
-Example:
-
-```yml
-# AI data validation example
-models:
-  - name: crm
-    description: "A table containing contract details."
-    columns:
-      - name: contract_date
-        description: "The date when the contract was signed."
-        tests:
-          - elementary.ai_data_validation:
-              expectation_prompt: "There should be no contract date in the future"
-```
-
-Learn more in our [AI data validations documentation](https://docs.elementary-data.com/data-tests/ai-data-tests/ai_data_validations).
-
-## Quickstart - dbt Package
-
-1. Add to your `packages.yml`:
-
-```
-packages:
-  - package: elementary-data/elementary
-    version: 0.23.1
-    ## Docs:
-
-```
+
+
+
-2. Run `dbt deps`
-3. Add to your `dbt_project.yml`:
+---
-
-```
-models:
-  ## elementary models will be created in the schema '_elementary'
-  ## for details, see docs:
-  elementary:
-    +schema: "elementary"
+## Get the most out of the dbt package
-```
+The dbt package works standalone and integrates with both:
-
-4. Run `dbt run --select elementary`
+- **[Elementary OSS](https://docs.elementary-data.com/oss/oss-introduction)** — Self-hosted CLI for alerts and a local observability report.
+- **[Elementary Cloud](https://docs.elementary-data.com/cloud/introduction)** — A full Data & AI Control Plane with automated ML monitoring, column-level lineage from ingestion to BI and AI assets, a built-in catalog, and AI agents that scale reliability workflows for engineers and business users. [Start a trial →](https://www.elementary-data.com/signup) or [book a demo →](https://meetings-eu1.hubspot.com/joost-boonzajer-flaes/intro-call-sl-)
-
-Check out the [full documentation](https://docs.elementary-data.com/).
+---
 ## Community & Support
-
-- [Slack](https://join.slack.com/t/elementary-community/shared_invite/zt-uehfrq2f-zXeVTtXrjYRbdE_V6xq4Rg) (Talk to us, support, etc.)
-- [GitHub issues](https://github.com/elementary-data/elementary/issues) (Bug reports, feature requests)
-
-## Contributions
-
-Thank you :orange_heart: Whether it's a bug fix, new feature, or additional documentation - we greatly appreciate contributions!
+- [Slack community](https://join.slack.com/t/elementary-community/shared_invite/zt-3s3uv8znb-7eBuG~ApwOa637dpVFo9Yg) — questions, team and AI support, and conversation
+- [GitHub Issues](https://github.com/elementary-data/elementary/issues) — bug reports and feature requests
+- [elementary-data.com](https://www.elementary-data.com/) — product, use cases, and more
-Check out the [contributions guide](https://docs.elementary-data.com/oss/general/contributions) and [open issues](https://github.com/elementary-data/elementary/issues) in the main repo.
+Contributions are always welcome. See the [contributions guide](https://docs.elementary-data.com/oss/general/contributions) to get started. 🧡
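Since the new README replaces the inline quickstart with a link, here is the setup it pointed to, condensed from the section this diff removes. A minimal sketch: `0.23.1` was the version pinned in the removed text, so check dbt Hub for the current release, and note that the two snippets belong to different files as the comments indicate:

```yml
# packages.yml: add the package, then run `dbt deps`
packages:
  - package: elementary-data/elementary
    version: 0.23.1

# dbt_project.yml: elementary's models are created in the
# schema '<target_schema>_elementary'
models:
  elementary:
    +schema: "elementary"
```

After `dbt deps`, build the package's models once with `dbt run --select elementary`; from then on, every dbt invocation populates the Elementary tables automatically via the on-run-end hook.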