51 changes: 51 additions & 0 deletions CHANGELOG.md
@@ -0,0 +1,51 @@
# Changelog

All notable changes to Sift are documented here.

Format follows [Keep a Changelog](https://keepachangelog.com/en/1.1.0/).
Versions follow [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

---

## [Unreleased]

_Changes staged for the next release go here during development._

---

## [0.1.0] — 2026-04-08

### Added

- Initial MVP release of Sift — a privacy-first, local-only clinical intelligence engine
- Tauri v2 host: Windows system tray, native folder-picker dialog, recursive file system watching via `notify` crate
- Node.js Express backend sidecar with SQLite (`better-sqlite3`) for document storage
- Ingestion pipelines for FHIR JSON, HL7 v2 (pipe-delimited), and PDF (text extraction via `pdf-parse`)
- OpenAI-compatible LLM client targeting local Ollama (`http://127.0.0.1:11434/v1`); heuristic fallback when LLM is unavailable
- React + Vite + Tailwind CSS frontend with pastel colour scheme
- Document list with real-time processing status, auto-polling, and trash-icon deletion
- Manual ingest via browse button (native dialog) or typed path; "Scan watch folder" button
- Auto-ingest when a new watch folder is selected
- Custom `MarkdownText` component for rendering LLM-generated markdown summaries
- Print-to-PDF via hidden iframe (`printReport.ts`)
- Settings UI: LLM base URL, model name, "Use Ollama defaults" button
- `sift.mjs` developer CLI: `run`, `check`, `deps`, `debug`, `stop`, `package` subcommands
- Backend sidecar bundled as a Windows `.exe` via `pkg`; spawned and lifecycle-managed by Tauri in production builds
- Jest + ts-jest backend test suite (unit + integration with in-memory SQLite)
- Vitest + @testing-library/react frontend test suite (component + API client)
- GitHub Actions CI workflow: backend (Jest), frontend (Vitest + build), Rust (`cargo check`)
- GitHub Actions release workflow: automated Windows NSIS installer on version tag
- Cursor coding rules (TypeScript, Rust, security, testing) in `.cursor/rules/`
- Healthcare compliance skills (PHI, HIPAA, EMR, CDSS) in `.cursor/skills/`
- Sample FHIR, HL7, and PDF test files in `samples/`
- Full documentation suite: QUICKSTART, INSTALLATION, DEVELOPMENT, CONFIGURATION, ARCHITECTURE, BUILD-AND-RELEASE, TESTING, TROUBLESHOOTING, SECURITY-AND-COMPLIANCE, GLOSSARY, CONTRIBUTING

### Security

- All patient data processed locally; no network egress by design (HIPAA-safe by architecture)
- LLM inference via local Ollama — PHI never leaves the machine
Comment on lines +45 to +46

Copilot AI Apr 8, 2026

The Security bullets claim “no network egress by design” / “PHI never leaves the machine”, but the app supports configuring `llm_base_url` (UI + `SIFT_LLM_BASE_URL`) and the backend sends excerpts to whatever URL is configured. Please reword these lines to avoid implying a guarantee; e.g., clarify that the default is localhost/Ollama and that PHI may be transmitted to the configured LLM endpoint if it is not local.

Suggested change

Remove:

- All patient data processed locally; no network egress by design (HIPAA-safe by architecture)
- LLM inference via local Ollama — PHI never leaves the machine

Replace with:

- Default configuration processes patient data locally and targets a local Ollama endpoint (`http://127.0.0.1:11434/v1`)
- When `llm_base_url` is set to a non-local endpoint, excerpts containing PHI may be transmitted to that configured LLM service

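One way to back the suggested wording with behavior, sketched under assumed names (neither `LlmConfig` nor `assertEgressAllowed` is taken from the Sift codebase), is to refuse or warn before sending excerpts to a non-loopback endpoint unless the user has explicitly opted in:

```ts
import { URL } from "node:url";

// Hypothetical config shape; Sift's actual settings keys may differ.
interface LlmConfig {
  baseUrl: string;               // e.g. "http://127.0.0.1:11434/v1" (Ollama default)
  allowRemoteEndpoint?: boolean; // explicit opt-in for non-local endpoints
}

// WHATWG URL keeps brackets around IPv6 hostnames, so include both forms.
const LOOPBACK_HOSTS = new Set(["127.0.0.1", "localhost", "::1", "[::1]"]);

function isLocalEndpoint(baseUrl: string): boolean {
  try {
    return LOOPBACK_HOSTS.has(new URL(baseUrl).hostname);
  } catch {
    return false; // unparsable URL: treat as non-local
  }
}

// Call before any request that includes document excerpts.
export function assertEgressAllowed(cfg: LlmConfig): void {
  if (isLocalEndpoint(cfg.baseUrl)) return;
  if (!cfg.allowRemoteEndpoint) {
    throw new Error(
      `llm_base_url (${cfg.baseUrl}) is not a loopback address; ` +
        "refusing to send excerpts without an explicit opt-in"
    );
  }
  console.warn(`Sending excerpts to non-local LLM endpoint: ${cfg.baseUrl}`);
}
```

Calling a guard like this immediately before the request keeps the default localhost path silent while making remote egress an explicit, logged decision.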

---

[Unreleased]: https://github.com/fleXRPL/sift/compare/v0.1.0...HEAD
[0.1.0]: https://github.com/fleXRPL/sift/releases/tag/v0.1.0