DBX Docs: Fix order of storage access options; Marked option 3 as legacy#2200
elazarlachkar wants to merge 953 commits into master from
Conversation
Update snippet syntax
- Update dbt package version from 0.19.1 to 0.20.0 in quickstart-package-install.mdx
- Update Docker example to use v0.20.0

Version references now align with actual Elementary versions:
- dbt package: 0.20.0
- CLI: 0.20.0
- Create /snippets/integrations/dbt-fusion.mdx with shared content
- Update OSS dbt-fusion.mdx to use the snippet
- Create cloud dbt-fusion.mdx to use the snippet
- Add dbt Fusion card to transformation-and-orchestration-cards.mdx
- Add dbt Fusion to docs.json navigation under Transformation & Orchestration
- Both OSS and cloud pages now reference the same content for maintainability

This branch is created fresh from the docs branch to ensure a proper base.
Add dbt Fusion integration to cloud section with shared snippet
* catalog -> governance
* add glue docs
* BI small docs fixes
* small fix to privatelink docs
* small fixes
* add glue and atlan icons + fix ask us link
* doc fix
- Added release notes files for v0.20.0 through v0.16.0
- Formatted in the same style as existing release notes (0.17.0, 0.18.0)
- Includes user-friendly summaries with emojis and links to GitHub releases
…sions-to-changelog Add missing OSS release notes
- Added all new release notes (0.20.0 through 0.16.0) to the navigation
- Ordered in descending version order (newest first)
…sions-to-changelog Add missing release notes to navigation
- Moved release notes from nested group to separate group at bottom
- Keeps release notes visible but separate from main package docs
- Release notes now appear at the bottom after 'Other Tests'
- Set as collapsible and collapsed by default (hidden unless clicked)
- Created release notes for dbt package versions 0.20.1 through 0.18.1
- Formatted in the same style as OSS release notes
- Includes user-friendly summaries with emojis and links to GitHub releases
…sions-to-changelog Add missing release versions to changelog
Made-with: Cursor
…e-to-product-docs Document table usage
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Update ci review docs
docs: lowercase secret names, remove api-url accordion
…ions-on-dbx-docs Update DBX permissions: Include cost; Reorganized Storage access
- State that dbt Cloud does not auto-populate job_name
- Document JOB_NAME and DBT_JOB_NAME; steps for env configuration
- Add --vars merge note, new-runs-only behavior, and SQL check

Made-with: Cursor
👋 @elazarlachkar
📝 Walkthrough

This PR comprehensively restructures the Elementary documentation system, migrating from a static structure to a modularized Mintlify-based platform. It introduces new Cloud platform documentation (AI agents, features, guides, integrations), reorganizes OSS documentation, adds Python SDK docs, refactors component includes from snippets to MDX imports, and adds tooling (pre-commit hook, Cursor rules, documentation guidelines).

Changes
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes

Possibly related PRs
```yaml
- repo: local
  hooks:
    - id: mintlify-validate
      name: Mintlify validate
      entry: bash -c "cd docs && mintlify broken-links | tee /dev/stderr | grep -q 'no broken links found'"
      language: system
      require_serial: true
      pass_filenames: false
```
🟡 Pre-commit mintlify hook missing files filter blocks all commits for developers without mintlify
The new `mintlify-validate` pre-commit hook at `.pre-commit-config.yaml:47-54` has no `files` filter, so it triggers on every commit regardless of which files changed. Since `language: system` requires `mintlify` (a Node.js CLI tool) to be available on PATH, any developer who follows the CONTRIBUTING.md instructions (`pre-commit install`) but doesn't have `mintlify` installed globally will have all commits blocked, even pure Python changes. The other existing hooks either target specific file patterns (e.g. `files: ^elementary/.*\.py$` for mypy) or are language-agnostic. Adding `files: ^docs/` would scope the hook to documentation changes only.
Suggested change:

```diff
 - repo: local
   hooks:
     - id: mintlify-validate
       name: Mintlify validate
       entry: bash -c "cd docs && mintlify broken-links | tee /dev/stderr | grep -q 'no broken links found'"
       language: system
       require_serial: true
       pass_filenames: false
+      files: ^docs/
```
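For context on how the `files` filter works: pre-commit matches each staged path against the hook's `files` regex with an unanchored search and skips the hook when nothing matches. A minimal sketch of that matching behavior (a hypothetical helper that only approximates pre-commit's actual include/exclude logic):

```python
import re

def hook_would_run(files_pattern: str, staged_paths: list[str]) -> bool:
    """Approximate pre-commit's `files` filtering: a hook runs only if at
    least one staged path matches the regex (an unanchored search)."""
    pattern = re.compile(files_pattern)
    return any(pattern.search(path) for path in staged_paths)

# With `files: ^docs/`, a pure-Python change no longer triggers the hook:
print(hook_would_run(r"^docs/", ["elementary/monitor/api.py"]))  # False
print(hook_would_run(r"^docs/", ["docs/cloud/quickstart.mdx"]))  # True
```

This is why scoping the hook keeps commits that touch only `elementary/` unaffected.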
Actionable comments posted: 4
Note
Due to the large number of review comments, Critical severity comments were prioritized as inline comments.
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (3)
docs/cloud/integrations/dwh/bigquery.mdx (1)
Lines 2-3: ⚠️ Potential issue | 🟡 Minor

Use canonical product casing: "BigQuery".

`title` and `sidebarTitle` currently use "Bigquery"; please switch to "BigQuery" for naming consistency in the docs UI.

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `docs/cloud/integrations/dwh/bigquery.mdx` around lines 2-3, update the document metadata to use correct product casing by changing the frontmatter fields `title` and `sidebarTitle` from "Bigquery" to "BigQuery" (locate the `title` and `sidebarTitle` entries in docs/cloud/integrations/dwh/bigquery.mdx and update their values accordingly).

docs/data-tests/anomaly-detection-tests/event-freshness-anomalies.mdx (1)
Line 17: ⚠️ Potential issue | 🟡 Minor

Use "complements" instead of "compliments."

Line 17 should use "complements" (adds to), not "compliments" (praises).

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `docs/data-tests/anomaly-detection-tests/event-freshness-anomalies.mdx` at line 17, replace the incorrect word "compliments" with "complements" in the sentence "This test compliments the `freshness_anomalies` test and is primarily intended for data that is updated in a continuous / streaming fashion." so it reads "This test complements the `freshness_anomalies` test..." to correct the usage; update the markdown content where that exact sentence appears.

docs/data-tests/anomaly-detection-tests/all-columns-anomalies.mdx (1)
Lines 16-19: ⚠️ Potential issue | 🔴 Critical

Fix parameter name on line 16: use `column_anomalies` (singular), not `columns_anomalies` (plural).

Line 16 incorrectly refers to the `columns_anomalies` configuration, but the correct parameter name is `column_anomalies` (singular). This is confirmed by the configuration page, the HTML reference table on line 34, and all YAML examples in the file and throughout the codebase.

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `docs/data-tests/anomaly-detection-tests/all-columns-anomalies.mdx` around lines 16-19, replace the incorrect plural parameter name `columns_anomalies` with the correct singular `column_anomalies` in the documentation text so it matches the rest of the codebase and YAML examples; search for the phrase `columns_anomalies` in the doc (the sentence that currently reads "configured using the `columns_anomalies` configuration") and update it to use `column_anomalies` instead.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 37ba146b-4743-4ed3-ad35-e47fe2177d8a
⛔ Files ignored due to path filters (1)
`docs/elementary_orange_favicon.png` is excluded by `!**/*.png`
📒 Files selected for processing (299)
💤 Files with no reviewable changes (30)
- dev-requirements.txt
- docs/_snippets/cloud/integrations/athena.mdx
- docs/_snippets/faq/question-schema.mdx
- docs/_snippets/products-cards.mdx
- docs/_snippets/cloud/integrations/databricks.mdx
- docs/_snippets/dwh/bigquery/cli_service_account.mdx
- docs/_snippets/dwh/bigquery/cloud_service_account.mdx
- docs/_snippets/oss/oss-introduction-opening.mdx
- docs/_snippets/oss/oss-introduction.mdx
- docs/features/config-as-code.mdx
- docs/_snippets/dwh/databricks/create_service_principal.mdx
- docs/cloud/guides/alert-rules.mdx
- docs/_snippets/cloud/integrations/cards-groups/connect-dwh-cards.mdx
- docs/_snippets/profiles/all-profiles.mdx
- docs/features/ci.mdx
- docs/mint.json
- docs/_snippets/faq/question-test-results-sample.mdx
- docs/features/multi-env.mdx
- docs/features/column-level-lineage.mdx
- docs/features/data-tests.mdx
- docs/features/elementary-alerts.mdx
- docs/_snippets/cloud/integrations/cards-groups/cloud-integrations-cards.mdx
- docs/_snippets/cloud/integrations/cards-groups/bi-cards.mdx
- docs/features/lineage.mdx
- docs/features/catalog.mdx
- docs/features/integrations.mdx
- docs/guides/modules-overview/dbt-package.mdx
- docs/features/automated-monitors.mdx
- docs/dbt/dbt-artifacts.mdx
- docs/dbt/package-models.mdx
| Variable | Masked | Required | Description |
|---|---|---|---|
| `elementary_api_key` | Yes | Yes | Your Elementary Cloud API key |
| `elementary_env_id` | Yes | No | Your Elementary environment ID. Only needed when the repository is connected to multiple environments. |
| `gitlab_api_token` | Yes | No | Project Access Token with `api` scope. Only needed if you cannot enable CI/CD job token access (see below). |
🧩 Analysis chain
🏁 Script executed:
```shell
#!/bin/bash
# Description: Verify the variable names expected by the GitLab remote template
curl -s 'https://raw.githubusercontent.com/elementary-data/elementary-ci/v1/templates/mr-review.yml' | rg -i 'elementary_api_key|elementary_env_id|gitlab_api_token'
```

Repository: elementary-data/elementary
Length of output: 321
Remove elementary_api_key from the variables table.
The remote GitLab template (https://raw.githubusercontent.com/elementary-data/elementary-ci/v1/templates/mr-review.yml) references only `elementary_env_id` and `gitlab_api_token`. The variable `elementary_api_key` does not appear in the template and should be removed from the documentation table.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@docs/cloud/features/ci.mdx` around lines 118 - 122, Remove the table row for
`elementary_api_key` from the variables table in docs/cloud/features/ci.mdx
because the remote GitLab template only uses `elementary_env_id` and
`gitlab_api_token`; update the table so it contains only the `elementary_env_id`
and `gitlab_api_token` rows (keep headers `Variable | Masked | Required |
Description`) and verify `elementary_env_id` and `gitlab_api_token` descriptions
remain unchanged.
```diff
 <Frame>
-  <img
-    src="https://res.cloudinary.com/diuctyblm/image/upload/v1701701243/elementary_cloud_data_observability_dashboard_dec_2024_oaos2y.png"
-    alt="Elementary Data Observability Dashboard"
-  />
+  <img src="https://res.cloudinary.com/diuctyblm/image/upload/v1701701243/elementary_cloud_data_observability_dashboard_dec_2024_oaos2y.png" alt="Elementary Data Observability Dashboard" />
 </Frame>
```
🧩 Analysis chain
🏁 Script executed:
```shell
#!/bin/bash
# Description: Check docs/skills.md for image formatting conventions
# Retrieve the image conventions section from skills.md
rg -A 10 -B 2 -i "image|<img>|<Frame>" docs/skills.md
```

Repository: elementary-data/elementary
Length of output: 1416
Add required caption attribute to <Frame> and move image to local storage.
The <Frame> tag is missing the required caption attribute per docs/skills.md guidelines ("All images must be wrapped in a <Frame> with a caption. No exceptions"). Also, the image must be stored locally in docs/pics/ instead of using external hosting. Update to:
```mdx
<Frame caption="Elementary Data Observability Dashboard">
  <img
    src="/pics/elementary-data-observability-dashboard.png"
    alt="Elementary Data Observability Dashboard"
  />
</Frame>
```

Ensure the caption and alt text are identical and the image is moved to the local directory.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In
`@docs/cloud/features/collaboration-and-communication/data-observability-dashboard.mdx`
around lines 9 - 11, The Frame element is missing the required caption and the
image is externally hosted; update the <Frame> tag to include a caption
attribute whose text matches the <img> alt text and replace the external src
with a local project images file (move the image into the repository’s docs
images folder and reference that local path in the <img> src); ensure the
caption value and the img alt are identical and the Frame wraps the img as
shown.
````diff
-Once the VPC endpoint service is successfully created, navigate to the "Allow principals tab" and click on "Allow Principals". Then add the following principal:
-```
-arn:aws:iam::743289191656:root
-```
+After the endpoint service finishes creating, navigate to the "Allow principals" section and select "Allow principals." Add Elementary's AWS account ID: `743289191656`.
````
🧩 Analysis chain
🌐 Web query:
For AWS VPC Endpoint Services ("Allow principals"), what formats are accepted in console/API: IAM principal ARN, AWS account ID, or both?
💡 Result:
For AWS VPC Endpoint Services (“Allow principals”), the accepted format in the console/API is IAM principal ARN (including the “account root” ARN form like arn:aws:iam:::root). The API parameter AddAllowedPrincipals/RemoveAllowedPrincipals are explicitly typed as principal ARNs, and the console entry field asks you to “enter the ARN of the principal.” AWS account ID only is not an accepted standalone format for endpoint service “Allow principals”; it must be provided as part of an ARN (e.g., arn:aws:iam:::root).
Citations:
- 1: https://docs.aws.amazon.com/AWSEC2/latest/APIReference/API_ModifyVpcEndpointServicePermissions.html
- 2: https://docs.aws.amazon.com/vpc/latest/privatelink/configure-endpoint-service.html
- 3: https://docs.aws.amazon.com/cli/latest/reference/ec2/modify-vpc-endpoint-service-permissions.html
🏁 Script executed:
```shell
cat docs/skills.md
```

Repository: elementary-data/elementary
Length of output: 11645
Use ARN format consistently for the principal — AWS only accepts IAM principal ARN, not raw account ID.
Lines 262–268 provide the principal in two different formats. AWS VPC Endpoint Services only accept IAM principal ARN format (arn:aws:iam::<account-id>:root); raw account ID alone is not accepted. Replace the second instruction to match the first.
Suggested fix:

```diff
-After the endpoint service finishes creating, navigate to the "Allow principals" section and select "Allow principals." Add Elementary's AWS account ID: `743289191656`.
+After the endpoint service finishes creating, navigate to the "Allow principals" section and select "Allow principals." Add Elementary's principal:
+`arn:aws:iam::743289191656:root`
```

🤖 Prompt for AI Agents
+`arn:aws:iam::743289191656:root`🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In
`@docs/cloud/integrations/security-and-connectivity/aws-privatelink-integration.mdx`
around lines 262 - 268, The two instructions use inconsistent principal formats;
replace the raw account ID occurrence ("743289191656") with the IAM principal
ARN format used earlier ("arn:aws:iam::743289191656:root") so both places
instruct adding the principal as an IAM principal ARN accepted by AWS VPC
Endpoint Services; update the second sentence that currently says "Add
Elementary's AWS account ID: `743289191656`" to instead instruct adding the ARN
`arn:aws:iam::743289191656:root`.
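Since endpoint services accept only IAM principal ARNs, a quick format check can catch the bare-account-ID mistake before it reaches the console. A sketch (a hypothetical validator) matching the account-root ARN form:

```python
import re

ROOT_ARN_RE = re.compile(r"arn:aws:iam::\d{12}:root")

def is_root_principal_arn(principal: str) -> bool:
    # VPC endpoint services accept IAM principal ARNs, not bare account IDs;
    # this checks the account-root form (12-digit account ID) specifically.
    return ROOT_ARN_RE.fullmatch(principal) is not None

print(is_root_principal_arn("arn:aws:iam::743289191656:root"))  # True
print(is_root_principal_arn("743289191656"))                    # False
```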
```mdx
---
title: "DuckDB"
---

import DuckdbCli from '/snippets/cli/duckdb-cli.mdx';

<Tabs>
  <Tab title="Elementary CLI">
    <DuckdbCli />
  </Tab>
</Tabs>

### Have a question?

We are available
on [Slack](https://elementary-data.com/community), reach out
for any kind of help!
```
🧩 Analysis chain
🏁 Script executed:
```shell
#!/bin/bash
# Description: Check if there are any Databricks-related documentation changes in this PR
# Search for DBX/Databricks documentation files
fd -e mdx -e md . docs/ --exec rg -l -i 'databricks|dbx|storage.access' {}
```

Repository: elementary-data/elementary
Length of output: 1388
🏁 Script executed:
```shell
cat docs/skills.md
```

Repository: elementary-data/elementary
Length of output: 11645
This file does not belong in this PR, and if it does, it must be restructured per the integration documentation template.
The PR is titled "DBX Docs: Fix order of storage access options; Marked option 3 as legacy," which focuses on Databricks storage access configuration. This DuckDB integration file is unrelated to Databricks and appears to be included in the wrong PR.
If DuckDB integration documentation is intentionally part of this PR, the file must follow the integration page structure defined in docs/skills.md:

- Add a one-sentence summary after the frontmatter explaining what the DuckDB integration enables
- Include an image wrapped in a `<Frame>` with a caption
- Add an `## Enabling DuckDB` section with numbered setup steps (do not use `<Tabs>` with a single tab; `<Tabs>` is for platform variants like "Cloud vs Server")
- Remove the undersized "Have a question?" footer in favor of proper section structure
- Add any capabilities, configuration options, or troubleshooting guidance as separate sections

Verify the file's intended placement and restructure accordingly.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@docs/oss/integrations/duckdb.mdx` around lines 1 - 19, This DuckDB doc
(containing DuckdbCli, <Tabs> and <Tab> usage) does not belong in this PR or is
not following the integration template; either remove it from this PR or
restructure it to match docs/skills.md: add a one-sentence summary after the
frontmatter describing what the DuckDB integration enables, include an image
inside a <Frame> with a caption, replace the single <Tabs> wrapper with a
top-level "## Enabling DuckDB" section containing numbered setup steps (avoid
<Tabs> for a single tab), remove the small "Have a question?" footer and instead
add proper sections for capabilities, configuration options, and
troubleshooting, and ensure the file follows the integration page structure
before committing.
```mdx
- **Client ID**: The Application (client) ID of the service principal (the "Application ID" you copied in [step 5](#create-service-principal)).
- **Client secret**: The OAuth secret you generated for the service principal (see [step 7](#create-service-principal)).
```
🟡 Databricks OAuth secret references wrong step number (step 7 instead of step 6)
In the Databricks connection form documentation, the Client secret field references "step 7" for the OAuth secret generation, but the actual OAuth secret generation happens in step 6 of the create_service_principal.mdx snippet. Step 7 is about enabling predictive optimization (docs/snippets/dwh/databricks/create_service_principal.mdx:68), not generating credentials (docs/snippets/dwh/databricks/create_service_principal.mdx:35-66). This would direct users to the wrong step when setting up Databricks authentication.
```diff
-- **Client secret**: The OAuth secret you generated for the service principal (see [step 7](#create-service-principal)).
+- **Client secret**: The OAuth secret you generated for the service principal (see [step 6](#create-service-principal)).
```
Closing in favor of #2201
The warning added on DBX looks like this:
Summary by CodeRabbit
Release Notes
New Features
Documentation