Thank you for contributing to Dagcellent or even considering it.
Also see the Python Packaging User Guide.
Install hatch with pipx and configure it. As a minimum, configure it to create the venv in your project folder:

```toml
[dirs.env]
virtual = ".venv"
```

- tests: `hatch run test:test`
- docs: `hatch run dev:docs`
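The hatch setup above can also be done from the command line; a minimal sketch, assuming pipx is already installed:

```shell
# Install hatch in an isolated environment
pipx install hatch
# Make hatch create virtual environments inside the project folder
hatch config set dirs.env.virtual .venv
# Inspect the resulting configuration
hatch config show
```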
- Clone the repo
- Run `hatch env create && hatch env create dev && hatch env create test`
- Open VSCode: `code .`
- Select the `dev` environment for development. See the documentation.
To enable test discovery and test debugging, change the Python interpreter path to a test environment's path, e.g. `test.py3.11`.
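To locate the interpreter path to paste into VSCode, hatch can print the environment's location; a sketch, where the exact environment name (e.g. `test.py3.11`) depends on your test matrix:

```shell
# Print the on-disk location of the given hatch environment;
# the interpreter lives in its bin/ (or Scripts/ on Windows) folder.
hatch env find test.py3.11
```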
I suggest NeoVim or Zed. See the cheat-sheet below.
The docs are built into the "sites" folder, which is gitignored; the docs are built in CI.
The latest version of the documentation is available on a GitHub Pages site, linked in the GitHub repo.
To preview the documentation locally, start the mkdocs server; see the cheat sheet.
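A sketch of serving the docs locally, assuming mkdocs is installed in the `dev` environment:

```shell
# Serve the docs with live reload; by default mkdocs listens on
# http://127.0.0.1:8000
hatch run dev:mkdocs serve
```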
The deployment manifests and Dockerfile for the k8s-hosted docs site are under `k8s/`.
We release the package to a private package feed in ADO, under "The Compass" project's "compass" feed.
The project uses semantic versioning.
git-cliff is used for changelog generation.
We do not distinguish between a developer and a user changelog/news. The changelog is generated directly from your git commits, so it is necessary to write commits according to Conventional Commits.
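As a sketch, Conventional Commits subjects look like the following; the scopes and messages are made up, and a throwaway repo is used so the commands are safe to run anywhere:

```shell
# Create a throwaway repo to demonstrate commit message style
repo=$(mktemp -d)
git -C "$repo" init -q
# type(scope): description
git -C "$repo" -c user.name=dev -c user.email=dev@example.com \
  commit -q --allow-empty -m "feat(dags): add a SQL reflection helper"
# A '!' after the type marks a breaking change
git -C "$repo" -c user.name=dev -c user.email=dev@example.com \
  commit -q --allow-empty -m "feat!: drop support for Python 3.8"
# List the commit subjects, newest first
git -C "$repo" log --pretty=%s
```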
Breaking changes will occur unannounced before v1.0.0. After that, every breaking change will bump the major version number.
It is recommended to use Podman with the container files. The base Dockerfile can be used to run Airflow with dagcellent installed in editable mode, giving you a short feedback loop.
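A hypothetical sketch of that workflow; the image name and mount path are assumptions, not fixed by this repo:

```shell
# Build the base image from the repo's Dockerfile
podman build -t dagcellent-airflow -f Dockerfile .
# Mount the source into the container so code changes are picked up
# without rebuilding the image
podman run --rm -it -v "$(pwd):/opt/airflow/dagcellent" dagcellent-airflow
```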
The testing suite uses Pytest.
coming soon 👀
The CI will run integration tests, where external components are not mocked; real containerized services are used instead. All integration tests are marked with the `integration` marker.
In general, prefer integration/system tests over mocking. For example, to guarantee that our tools work on various SQL engines, implement integration tests against those engines.
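As a sketch, an integration test carries the marker like this; the test name and body are hypothetical, and real tests live under `tests/integration/` and hit actual containerized services:

```python
import pytest


# Hypothetical integration test: marked so CI can select it with
# `pytest -m integration` (and local runs can skip it with -m "not integration")
@pytest.mark.integration
def test_reflect_table_against_real_engine():
    # Placeholder for a check against a real SQL engine container
    assert True
```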
The following structure illustrates where to find the various integration tests.

```
tests
├── dags
└── integration
    ├── mlflow
    ├── mssql
    └── psql
```