diff --git a/README.md b/README.md deleted file mode 100644 index 192875e..0000000 --- a/README.md +++ /dev/null @@ -1,98 +0,0 @@ -# DevSoc 2026 Hackathon -## Advanced LLM Reasoning & Verification Challenge - -[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](./LICENSE) -[![Status: Active](https://img.shields.io/badge/Status-Active-success)](https://github.com/topics/hackathon) - -> **Mission:** Build "GraphMind"β€”an advanced conversational AI that uses **Graph of Thoughts (GoT)** and **Mixture of Experts (MoE)** to provide verified answers using **ONLY** data scraped from [MetaKGP](https://wiki.metakgp.org/). - -## Schedule & Timeline -**Duration:** 5 Days (Monday - Friday) - -| Event | Date | Time | Details | -| :--- | :--- | :--- | :--- | -| **Kickoff** | **Mon, Jan 12** | 5:00 PM | Problem Release & Team Formation | -| **Code Freeze** | **Fri, Jan 16** | **12:00 PM** | **Submission Deadline (Strict)** | - -## 1. The Problem: "Trust, but Verify" - -Large Language Models (LLMs) often hallucinate when dealing with niche, institutional knowledge. They generate plausible-sounding but factually incorrect information because they lack real-time access to specific local data. - -### The Challenge -Your task is to build a chatbot that answers questions **strictly** using data you scrape from **MetaKGP / MetaWiki**. - -**Why this is hard:** -1. **No Pre-made Dataset:** You must build the pipeline to scrape, clean, and index the data yourself. -2. **Stale Data Risks:** LLMs have outdated internal knowledge about IIT Kharagpur; you must force them to use *your* scraped data. -3. **Hallucination:** If the scraper misses a page, the LLM might make something up. Your verification system must prevent this. - -## 2. Technical Pillars - -### Team Constraint -* **Team Size:** Strictly **4 Members** per team. - -### Core Requirements -Your solution **must** integrate these three techniques: - -#### 1. 
Data Pipeline (Scraping & Indexing) -* **Scraper:** You must write a script to crawl `wiki.metakgp.org` (and related MetaWiki pages). -* **Ingestion:** Clean the HTML/Wikitext and chunk it for retrieval (RAG). -* **Constraint:** **NO external datasets** allowed. If the answer isn't on MetaKGP, the bot should say "I don't know." - -#### 2. Graph of Thoughts (GoT) -Model reasoning as a directed graph. -* **Nodes:** Facts extracted from your scraped documents. -* **Edges:** Logical connections between different wiki pages. -* **Goal:** Connect disparate pieces of info (e.g., connect a *Society* page to a *Student* page). - -#### 3. Mixture of Experts (MoE) Verification -Implement specific verifiers that check against your scraped data: -1. **Source Matcher:** "Does the text in the retrieved chunk actually support this claim?" -2. **Hallucination Hunter:** "Is the bot inventing details not present in the scraped context?" -3. **Logic Expert:** "Does the conclusion follow from the premises?" - -## 3. Expected System Behavior - -### Example Query -**User:** *"Who are the governors of the Technology Literary Society?"* - -**System Output:** -* **Step 1 (Scrape/Retrieve):** System searches vector store for "Technology Literary Society governors". -* **Step 2 (Reasoning Paths):** - * *Path A:* Claims "John Doe" (Based on 2018 data). -> **Context Expert:** Outdated. - * *Path B:* Claims "Jane Smith" (Based on hallucination). -> **Source Matcher:** Citation missing. - * *Path C:* Claims "Current Governors listed in 2025 section". -> **Source Matcher:** Verified. -* **Step 3 (Final Answer):** "The current governors are... [List]. (Source: MetaKGP/TLS_Page)" - -## 4. Rules & Constraints - -### Data Source Rules (Strict) -1. **Allowed Source:** ONLY `wiki.metakgp.org` (and associated MetaWiki domains). -2. **Forbidden:** Wikipedia, Google Search API, or pre-trained knowledge usage. -3. **Scraping:** You must implement the scraping logic. 
Using a pre-downloaded dump is **not allowed**β€”your code must show how data is fetched. - -### Tech Stack -* **Open Source Only:** (LangChain, Scrapy, BeautifulSoup, Selenium, etc.) -* **API Limits:** Stay within provided free tier limits ($50/team). - -## 5. Evaluation Criteria - -| Criteria | Points | Description | -| :--- | :--- | :--- | -| **Data Pipeline** | **30** | Effectiveness of the scraper, cleaning, and indexing strategies. | -| **Verification (MoE)** | **30** | Ability to detect and stop hallucinations using the experts. | -| **MetaKGP Fidelity** | **20** | **CRITICAL:** Answers must be traceable back to specific MetaKGP URLs. | -| **UX & Demo** | **20** | Working chatbot, citation links, and graph visualization. | - -## 6. How to Submit - -**Deadline:** Friday, Jan 16 @ 12:00 PM. - -1. **Fork** this repository. -2. Create a folder: `submissions/YOUR_TEAM_NAME`. -3. Include your **Scraper Code** and **Chatbot Code**. -4. Add a `README.md` using the [Submission Template](./SUBMISSION_TEMPLATE.md). -5. Open a **Pull Request (PR)** to the `main` branch. - -### Deliverables Checklist -* [ ] Source Code (Scraper + Bot). diff --git a/SUBMISSION_TEMPLATE.md b/SUBMISSION_TEMPLATE.md deleted file mode 100644 index 9939d64..0000000 --- a/SUBMISSION_TEMPLATE.md +++ /dev/null @@ -1,42 +0,0 @@ -# Team Name: [Insert Team Name] - -## πŸ‘₯ Team Members -* Member 1: [Name] - [GitHub Profile] -* Member 2: [Name] - [GitHub Profile] -* Member 3: [Name] - [GitHub Profile] -* Member 4: [Name] - [GitHub Profile] - -## πŸ”— Project Links -* **Demo Video:** [Link to YouTube/Loom (3-5 mins)] -* **Hosted Demo:** [Optional: Link to live app] - -## πŸ€– Technical Implementation - -### 1. Data Pipeline (Scraping) -* **Tools Used:** (e.g., Scrapy, Beautiful Soup, Selenium) -* **Strategy:** Briefly explain how you scraped `wiki.metakgp.org`. How did you handle cleaning and indexing? -* **Indexing:** (e.g., ChromaDB, FAISS, Pinecone) - -### 2. 
Graph of Thoughts (GoT) -* **Reasoning Model:** Explain how your nodes and edges are structured. -* **Graph Logic:** How do you connect different MetaKGP pages? (e.g., connecting a "Society" page to a "Student" page). - -### 3. Mixture of Experts (MoE) -* **Expert 1 (Source Matcher):** How do you verify the text exists in the scraped data? -* **Expert 2 (Hallucination Hunter):** How do you detect fabricated info? -* **Expert 3 (Logic Expert):** How do you ensure logical consistency? - -## πŸ“Š Setup Instructions -* **Prerequisites:** (e.g., Python 3.10, Neo4j) -* **Environment Variables:** List required `.env` keys (do not share actual keys). -* **How to Run Scraper:** `python scrape.py` -* **How to Run Bot:** `streamlit run app.py` - -## πŸ“Έ Screenshots -* - -[Image of Graph Visualization] - -* - -[Image of Chat Interface] diff --git a/submissions/team_2/.env.example b/submissions/team_2/.env.example new file mode 100644 index 0000000..a706473 --- /dev/null +++ b/submissions/team_2/.env.example @@ -0,0 +1,28 @@ +# Database Configuration (PostgreSQL) +DB_HOST=localhost +DB_PORT=5432 +DB_NAME=metakgp_content +DB_USER=your_username +DB_PASSWORD=your_password +DB_SSLMODE=require + +# Modal Services (Embeddings Only) +MODAL_URL=https://your-workspace--metakgp-embeddings-fastapi-app.modal.run + +# LLM API Keys +GROQ_API_KEY=your_groq_api_key_here + +# ChromaDB Configuration +CHROMA_DIR=./chroma_data +GOT_CACHE_DIR=./chroma_data/got_cache + + +# Backend API Server Configuration +HOST=0.0.0.0 +PORT=8000 + +# Cache Configuration +CACHE_DIR=./cache + +# Indexer Configuration +BATCH_SIZE=100 diff --git a/submissions/team_2/.gitignore b/submissions/team_2/.gitignore new file mode 100644 index 0000000..20fd30e --- /dev/null +++ b/submissions/team_2/.gitignore @@ -0,0 +1,3 @@ +*.env + +!.env.* \ No newline at end of file diff --git a/submissions/team_2/README.md b/submissions/team_2/README.md new file mode 100644 index 0000000..c1677c5 --- /dev/null +++ 
b/submissions/team_2/README.md @@ -0,0 +1,186 @@ +# Team Name: Team 2 + +## Team Members +* Member 1: Ruhaan Kakar - [invincible1786](https://github.com/invincible1786) +* Member 2: Pawan Manighadhan - [pawan188](https://github.com/pawan188) +* Member 3: Sakshi S. Dwivedi - [dwivedi-jiii](https://github.com/dwivedi-jiii) +* Member 4: Arsh Goyal - [arshGoyalDev](https://github.com/arshGoyalDev) + +## Technical Implementation + +### 1. Data Pipeline (Scraping) +* Tools used: Python scraper (mwclient, mwparserfromhell, BeautifulSoup), concurrent/multi-threaded scraper, batch output to JSON. +* Strategy: We fetch the full page list from MetaKGP, then scrape pages in concurrent batches. Wikitext is cleaned into readable Markdown, infoboxes are extracted into a summary block, and links/templates are normalized. +* Chunking & Indexing: Cleaned page text is chunked (configurable) and sent to the indexer, which produces embeddings (Modal service) and stores vectors in ChromaDB for RAG. +* Key files: + - `submissions/team_2/scraper/src/main.py` - concurrent scraper and batch output + - `submissions/team_2/scraper/src/wikitext_cleaner.py` - converts wikitext -> cleaned Markdown + infobox extraction + - `submissions/team_2/scraper/results/` - contains `all_pages.json`, `scraped_data/` with cleaned pages + +### 2. Graph of Thoughts (GoT) +* Reasoning model: The system represents reasoning as a directed graph of "thought" nodes. Each node is derived from retrieved chunks (facts) from MetaKGP pages. Edges represent logical or topical connections discovered during retrieval and candidate generation. +* Node structure: extracted fact / claim text, source metadata (page title, chunk id, score), and a small provenance snippet. +* Edge logic: edges are created when two facts share strong semantic similarity, explicit links, or when the Logic Expert recommends merging nodes. This allows connecting a "Society" page to a related "Student" or "Event" page via intermediate facts.
+* Key files: + - `submissions/team_2/chatbot/backend/src/services/chat_agent/engine.py` - orchestrates GoT construction and search + - `submissions/team_2/chatbot/backend/src/services/chat_agent/router.py` - exposes endpoints for the GoT-driven query flow + +### 3. Mixture of Experts (MoE) Verification +We implemented a small Gauntlet of three experts that verify each candidate thought before it is accepted into the reasoning graph or included in the final answer. + +1. Expert 1 - Source Matcher: + - Verifies whether the meaning of a thought is semantically contained in the retrieved context chunks. + - Returns a confidence score and a short reasoning string. + - Implemented in `src/services/chat_agent/experts.py` as `SourceMatcher`. + +2. Expert 2 - Hallucination Hunter: + - Detects claims in a thought that are unsupported by the context and lists significant unsupported claims. + - Implements JSON-based verdicts (PASS/FAIL) and a confidence metric. + - Implemented in `src/services/chat_agent/experts.py` as `HallucinationHunter`. + +3. Expert 3 - Logic Expert: + - Ensures the candidate thought fits coherently into the existing reasoning chain (graph history), flags redundancy, and recommends actions: keep | merge | discard. + - Implemented in `src/services/chat_agent/experts.py` as `LogicExpert`. + +Orchestration: `MoEGauntlet` runs the three experts in parallel and computes a weighted vote to decide whether a thought passes. The project uses a lenient weighted formula (source: 40%, hallucination: 30%, logic: 30%) with permissive pass thresholds, so the system prioritizes recall while still surfacing provenance and failure reasons.
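The weighted vote described above can be sketched roughly as follows. This is a simplified illustration, not the actual `MoEGauntlet` implementation: the function names, the `0.5` pass threshold, and the toy substring check stand in for the real LLM-backed experts; only the 40/30/30 weights and the parallel execution come from the description above.

```python
import asyncio
from dataclasses import dataclass

# Weights from the write-up: source 40%, hallucination 30%, logic 30%.
WEIGHTS = {"source": 0.40, "hallucination": 0.30, "logic": 0.30}
PASS_THRESHOLD = 0.5  # hypothetical "lenient" cutoff, not the project's real value


@dataclass
class Verdict:
    expert: str
    passed: bool
    confidence: float  # 0.0 - 1.0
    reason: str


async def run_expert(expert: str, thought: str, context: str) -> Verdict:
    # Stand-in for an LLM call (SourceMatcher / HallucinationHunter / LogicExpert):
    # here, a thought is "supported" if it appears verbatim in the retrieved context.
    supported = thought.lower() in context.lower()
    return Verdict(
        expert,
        supported,
        0.9 if supported else 0.2,
        "supported by retrieved context" if supported else "no supporting chunk found",
    )


async def gauntlet(thought: str, context: str) -> tuple[bool, list[Verdict]]:
    # Run all three experts concurrently, then take a confidence-weighted vote.
    verdicts = await asyncio.gather(
        run_expert("source", thought, context),
        run_expert("hallucination", thought, context),
        run_expert("logic", thought, context),
    )
    score = sum(WEIGHTS[v.expert] * v.confidence for v in verdicts if v.passed)
    return score >= PASS_THRESHOLD, list(verdicts)
```

A thought passes only if the weighted vote clears the threshold; the per-expert `reason` strings are what gets surfaced as failure provenance.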
+ +## Setup Instructions + +### Prerequisites +* Python 3.11+ / 3.13+ (project tested with modern Python versions) +* Node 18+ (for the frontend dev server) +* PostgreSQL (optional but required for the indexer if using DB upload) +* Modal account (for GPU-accelerated embeddings service) +* Groq API key (required for LLM inference) - get from https://console.groq.com/ + +### Environment Variables + +**Important:** This project uses a **single unified `.env` file** at `submissions/team_2/.env` + +1. Copy the example configuration: +```bash +cd submissions/team_2 +cp .env.example .env +``` + +2. Edit `.env` with your credentials: +```bash +nano .env +``` + +Required variables: +```bash +# Database (PostgreSQL) +DB_HOST=localhost +DB_PORT=5432 +DB_NAME=metakgp_content +DB_USER=your_username +DB_PASSWORD=your_password +DB_SSLMODE=require + +# Modal Services (Embeddings only) +MODAL_URL=https://your-workspace--metakgp-embeddings-fastapi-app.modal.run + +# LLM API Keys (Required) +# Groq hosts Llama models for GoT reasoning and MoE verification +GROQ_API_KEY=your_groq_api_key_here + +# Backend Configuration (use default values) +CHROMA_DIR=./chroma_data +CACHE_DIR=./cache +HOST=0.0.0.0 +PORT=8000 +BATCH_SIZE=100 +``` + +**Note:** The scraper, backend, and all services read from this single `.env` file at the project root (`submissions/team_2/.env`). + +**LLM Architecture:** +- **Embeddings**: Modal-hosted `all-MiniLM-L6-v2` (for semantic search) +- **LLM Inference**: Groq-hosted `Llama-4-Scout-17b-16e` (for reasoning, MoE verification) + +### How to Run Scraper +1. Create and activate a Python venv in `submissions/team_2/scraper` and install requirements: + +``` +cd submissions/team_2/scraper +source venv/bin/activate +pip install -r requirements.txt +``` + +2. Fetch all page links (one-time): +``` +python src/fetch_all_links.py +``` + +3. 
Scrape pages (examples): +``` +# Quick test (10 pages) +python src/main.py results/all_pages.json --limit 10 + +# Batch scrape (100 pages in batches of 25) +python src/main.py results/all_pages.json --limit 100 --pages 25 --threads 8 + +# Full wiki (batches of 50) +python src/main.py results/all_pages.json --pages 50 --threads 4 +``` + +### How to Run Bot (Backend) +1. Setup environment and install dependencies: + +```bash +cd submissions/team_2/chatbot/backend +uv venv +uv pip install -e . +python -m spacy download en_core_web_sm +``` + +2. Ensure `.env` exists at project root with required variables (see Environment Variables section above) + +3. (Optional) Deploy Modal embedding service and set `MODAL_URL` in `.env`: + +```bash +modal setup +modal deploy src/utils/modal_embeddings.py +# copy the returned URL into MODAL_URL in submissions/team_2/.env +``` + +4. Start services: +```bash +./start.sh +# or run individual services: +# uvicorn src.app.main:app --reload +``` + +### How to Run Frontend +``` +cd submissions/team_2/chatbot/frontend +npm install +npm run dev +``` +The frontend expects the backend query route at `http://localhost:8000/got/query` by default. + +## πŸ“Έ Screenshots +* image + + +## Notes on Behavior & Constraints +* Data Source Rule: All answers must be traceable to scraped MetaKGP content. If the system cannot find supporting evidence it returns "I don't know." (enforced by the Source Matcher and Hallucination Hunter). +* Provenance: Final answers include page titles and short source snippets for traceability. +* Extensibility: The FastAPI app is modular; new experts or reasoning strategies can be added under `src/services/chat_agent/` and mounted via routers. + +## How we validated +* Scraper: sample runs (10/100 pages) and cleaned output checked in `submissions/team_2/scraper/results/scraped_data/`. +* Indexing: ChromaDB persisted vectors in `chroma_data/` and embedding calls are routed to Modal or a provided embedding endpoint. 
+* MoE: `MoEGauntlet` tests experts in parallel; the code is in `src/services/chat_agent/experts.py`. + +## Troubleshooting +* Check logs under `submissions/team_2/chatbot/backend/logs/` (indexer.log, query_service.log). +* Common fixes: ensure `.env` is populated, Modal URL points to a deployed embedding service, and PostgreSQL credentials are correct when using DB upload. + +## Deliverables Checklist +* [x] Scraper (source + results) +* [x] Backend (RAG indexer + FastAPI query service) +* [x] Frontend (React chat UI) +* [ ] Demo video (link placeholder) +* [ ] Hosted demo (optional) diff --git a/submissions/team_2/chatbot/backend/.gitignore b/submissions/team_2/chatbot/backend/.gitignore new file mode 100644 index 0000000..a2dd519 --- /dev/null +++ b/submissions/team_2/chatbot/backend/.gitignore @@ -0,0 +1,80 @@ +# Python +__pycache__/ +*.py[cod] +*$py.class +*.so +.Python +build/ +develop-eggs/ +dist/ +downloads/ +eggs/ +.eggs/ +lib/ +lib64/ +parts/ +sdist/ +var/ +wheels/ +*.egg-info/ +.installed.cfg +*.egg + +# Virtual Environment +venv/ +env/ +ENV/ +.venv + +# Environment Variables +.env +.env.local +.env.*.local + +# IDEs +.vscode/ +.idea/ +*.swp +*.swo +*~ +.DS_Store + +# Logs +logs/ +*.log + +# RAG System Specific +cache/ +chroma_data/ +.indexer.pid +.query_service.pid + +# uv +uv.lock + +# Output +output/ +chunks.json + +# Qdrant Storage +qdrant_storage/ + +# Modal +.modal/ + +# Testing +.pytest_cache/ +.coverage +htmlcov/ + +# Jupyter +.ipynb_checkpoints + +# OS +.DS_Store +Thumbs.db + +# Temporary files +*.tmp +*.bak +*.swp \ No newline at end of file diff --git a/submissions/team_2/chatbot/backend/README.md b/submissions/team_2/chatbot/backend/README.md new file mode 100644 index 0000000..e5db507 --- /dev/null +++ b/submissions/team_2/chatbot/backend/README.md @@ -0,0 +1,388 @@ +# MetaKGP RAG System + +Production-ready Retrieval Augmented Generation (RAG) system for MetaKGP wiki using ChromaDB, Modal, and FastAPI. + +## Quick Start + +### 1. 
Install uv (if not already installed) +```bash +curl -LsSf https://astral.sh/uv/install.sh | sh +``` + +### 2. Setup Project +```bash +# Create virtual environment +uv venv + +# Install dependencies +uv pip install -e . + +# Download spaCy model +source .venv/bin/activate # or `source venv/bin/activate` +python -m spacy download en_core_web_sm +``` + +### 3. Configure Environment +```bash +# The unified .env file is in the team_2 root directory +# Copy the example config +cd ../../ # Go to team_2 root +cp .env.example .env + +# Edit with your credentials +nano .env +``` + +Required variables (see `submissions/team_2/.env.example` for full list): +```bash +# Database +DB_HOST=localhost +DB_PORT=5432 +DB_NAME=metakgp_content +DB_USER=username +DB_PASSWORD=password +DB_SSLMODE=require + +# Modal Services (Embeddings only) +MODAL_URL=https://your-workspace--metakgp-embeddings-fastapi-app.modal.run + +# LLM API Keys (Required) +# Groq hosts Llama models for all LLM inference +GROQ_API_KEY=your_groq_api_key_here +``` + +**Note:** The `.env` file should be at `submissions/team_2/.env` (project root), not in the backend directory. + +**LLM Architecture:** +- Embeddings: Modal-hosted `all-MiniLM-L6-v2` +- LLM: Groq-hosted `Llama-4-Scout-17b-16e` (GoT reasoning + MoE verification) + +### 4. Deploy Modal Embedding Service +```bash +# First time setup +modal setup + +# Deploy +modal deploy src/utils/modal_embeddings.py + +# Copy the returned URL to MODAL_URL in .env +``` + +### 5. Start Everything +```bash +./start.sh +``` + +This starts: +- RAG Indexer (PostgreSQL β†’ Chunks β†’ Embeddings β†’ ChromaDB) +- FastAPI Query Service (Semantic search API) + +### 6. 
Stop Everything +```bash +./stop.sh +``` + +## Monitoring + +### View Logs +```bash +# Indexer logs +tail -f logs/indexer.log + +# Query service logs +tail -f logs/query_service.log + +# Both logs +tail -f logs/*.log +``` + +### Check Health +```bash +# Overall API health +curl http://localhost:8000/health + +# Query service health +curl http://localhost:8000/query/health +``` + +### API Documentation +- Swagger UI: http://localhost:8000/docs +- ReDoc: http://localhost:8000/redoc + +## API Usage + +### Search Request +```bash +curl -X POST http://localhost:8000/query/search \ + -H "Content-Type: application/json" \ + -d '{ + "query": "How do I register for courses?", + "top_k": 5 + }' +``` + +### Search with Filters +```bash +curl -X POST http://localhost:8000/query/search \ + -H "Content-Type: application/json" \ + -d '{ + "query": "hostel allocation process", + "top_k": 10, + "filters": { + "category": "Hostel", + "min_entity_count": 3 + } + }' +``` + +### Python Usage +```python +import requests + +response = requests.post('http://localhost:8000/query/search', json={ + 'query': 'What are the placement statistics?', + 'top_k': 5 +}) + +results = response.json() +for result in results['results']: + print(f"Score: {result['score']:.2f}") + print(f"Page: {result['metadata']['title']}") + print(f"Text: {result['text'][:200]}...") +``` + +## Architecture + +### Components + +1. **Modal Embedding Service** (`src/utils/modal_embeddings.py`) + - A100 GPU-accelerated sentence-transformers + - 384-dimensional embeddings (all-MiniLM-L6-v2) + - Auto-scaling on Modal with batch_size=500 + +2. **Embedding Client** (`src/utils/embedding_client.py`) + - HTTP client for Modal API + - Retry logic with exponential backoff + - Health checking + +3. **Chunk Processor** (`src/utils/chunk_processor.py`) + - Text chunking (512 words, 50 overlap) + - Entity extraction (spaCy) + - Metadata enrichment + +4. 
**ChromaDB Client** (`src/utils/chroma_client.py`) + - Persistent vector storage + - Cosine similarity search + - Metadata flattening + +5. **Indexing Service** (`src/services/indexing/indexer.py`) + - PostgreSQL streaming + - Embedding cache + - Batch processing + +6. **Query Service** (`src/services/query_service/`) + - **service.py**: Business logic for semantic search + - **router.py**: FastAPI routes (mounted at `/query`) + - Integrated into main FastAPI app + +7. **Main FastAPI App** (`src/app/main.py`) + - Unified API with all service routers + - Lifespan management for service initialization + - Extensible architecture for adding new services + +### Data Flow +``` +PostgreSQL → Chunk Processor → Embedding Client → ChromaDB + ↓ + Modal Service (GPU) +``` + +## Configuration + +### Environment Variables (.env) + +```bash +# Database (individual variables; use placeholders, never commit real credentials) +DB_HOST=your-db-host.example.com +DB_PORT=5432 +DB_NAME=metakgp_content +DB_USER=your_username +DB_PASSWORD=your_password +DB_SSLMODE=require + +# Modal Embedding Service +MODAL_URL=https://your-workspace--metakgp-embeddings.modal.run + +# Storage +CHROMA_DIR=./chroma_data +CACHE_DIR=./cache + +# API Server +HOST=0.0.0.0 +PORT=8000 + +# Indexer +BATCH_SIZE=100 +``` + +### Database Schema + +The indexer expects a PostgreSQL `metakgp_pages` table: + +```sql +CREATE TABLE metakgp_pages ( + id SERIAL PRIMARY KEY, + name VARCHAR(500) UNIQUE, + title VARCHAR(500), + cleaned_text TEXT, + categories TEXT[], + links TEXT[], + exists BOOLEAN DEFAULT TRUE, + redirect BOOLEAN DEFAULT FALSE, + revision INTEGER +); +``` + +## Development + +### Add Dependencies +```bash +# Edit pyproject.toml to add dependency +# Then: +uv pip install -e .
+``` + +### Run Individual Services + +#### Indexer Only +```bash +source .venv/bin/activate +python src/services/indexing/indexer.py \ + --chroma-dir ./chroma_data \ + --cache-dir ./cache +``` + +#### Query Service Only +```bash +source .venv/bin/activate +uvicorn src.app.main:app --reload +``` + +### Reset and Reindex +```bash +# Stop services +./stop.sh + +# Clear data +rm -rf cache/ chroma_data/ + +# Start fresh +./start.sh +``` + +## Performance + +- **Indexing Speed**: 50-100 pages/minute +- **Query Latency**: 1-2 seconds +- **Vector Dimensions**: 384 (all-MiniLM-L6-v2) +- **Storage**: ~54.6MB for 2,841 documents + +## Troubleshooting + +### Port Already in Use +```bash +# Find the process on port 8000 +lsof -i :8000 + +# Kill it (replace <PID> with the PID reported by lsof) +kill -9 <PID> +``` + +### Services Won't Stop +```bash +# Force kill all +pkill -9 -f "indexer" +pkill -9 -f "uvicorn.*api" +``` + +### Import Errors +```bash +# Reinstall dependencies +uv pip install -e . +python -m spacy download en_core_web_sm +``` + +### Modal Service Issues +```bash +# Check health +curl https://YOUR_MODAL_URL/embedding/health + +# Check Modal dashboard +modal app list +``` + +### Database Connection +```bash +# Test connection using the DB_* variables from .env +PGPASSWORD="$DB_PASSWORD" psql "host=$DB_HOST port=$DB_PORT dbname=$DB_NAME user=$DB_USER sslmode=$DB_SSLMODE" -c "SELECT COUNT(*) FROM metakgp_pages;" +``` + +## Project Structure + +``` +backend/ +├── src/ +│ ├── app/ # Main FastAPI application +│ │ └── main.py # Unified API with all routers +│ ├── services/ # Service layer +│ │ ├── indexing/ +│ │ │ └── indexer.py # Indexing pipeline +│ │ └── query_service/ +│ │ ├── service.py # Business logic +│ │ └── router.py # FastAPI routes +│ ├── utils/ # Shared utilities +│ │ ├── modal_embeddings.py # Modal deployment +│ │ ├── embedding_client.py # HTTP client +│ │ ├── chunk_processor.py # NLP processing +│ │ └── chroma_client.py # Vector storage +│ └── routers/ # Additional routers (future use) +├── cache/ # Embedding cache
+├── chroma_data/ # Vector database +├── logs/ # Service logs +├── pyproject.toml # Dependencies +├── .env.example # Config template +├── start.sh # Start script +├── stop.sh # Stop script +└── ARCHITECTURE.md # Architecture docs +``` + +## Key Features + +- Modern Tooling - uv, pyproject.toml, uvicorn +- Fast Setup - Two commands to start +- Production-Ready - Retry logic, graceful shutdown +- Unified API - All services under one FastAPI app with routers +- Extensible - Easy to add new services via routers +- Scalable Architecture - Services, utilities separation +- Resumable - Offset tracking survives restarts +- GPU Accelerated - Modal A100 auto-scaling +- Advanced NLP - Entity extraction with spaCy + +## License + +MIT + +## Contributing + +Contributions welcome! Please ensure code passes linting: + +```bash +# Install dev dependencies +uv pip install -e ".[dev]" + +# Format code +black src/ + +# Lint +ruff check src/ +``` diff --git a/submissions/team_2/chatbot/backend/pyproject.toml b/submissions/team_2/chatbot/backend/pyproject.toml new file mode 100644 index 0000000..f2624e8 --- /dev/null +++ b/submissions/team_2/chatbot/backend/pyproject.toml @@ -0,0 +1,62 @@ +[project] +name = "metakgp-rag" +version = "1.0.0" +description = "MetaKGP RAG System with ChromaDB and Modal" +readme = "README.md" +requires-python = ">=3.9" +license = {text = "MIT"} +authors = [ + {name = "MetaKGP Team"} +] +keywords = ["rag", "metakgp", "chromadb", "modal", "fastapi"] + +dependencies = [ + "chromadb>=0.4.22", + "sqlalchemy>=2.0.25", + "fastapi>=0.109.0", + "uvicorn[standard]>=0.27.0", + "pydantic>=2.5.3", + "pydantic-settings>=2.1.0", + "requests>=2.31.0", + "httpx>=0.26.0", + "spacy>=3.7.2", + "en-core-web-sm @ https://github.com/explosion/spacy-models/releases/download/en_core_web_sm-3.8.0/en_core_web_sm-3.8.0-py3-none-any.whl", + "modal>=0.57.0", + "psycopg2-binary>=2.9.9", + "python-dotenv>=1.0.0", + "langgraph>=0.2.0", +
"langchain-core>=0.3.0", + "groq>=0.4.0", + "networkx>=3.2", + "pyvis>=0.3.2", +] + +[project.optional-dependencies] +dev = [ + "pytest>=7.0.0", + "black>=23.0.0", + "ruff>=0.1.0", +] + +[project.scripts] +start-rag = "metakgp_rag.scripts:start" +stop-rag = "metakgp_rag.scripts:stop" + +[build-system] +requires = ["hatchling"] +build-backend = "hatchling.build" + +[tool.hatch.build.targets.wheel] +packages = ["src"] + +[tool.black] +line-length = 100 +target-version = ['py38', 'py39', 'py310', 'py311'] + +[tool.ruff] +line-length = 100 +target-version = "py38" + +[tool.ruff.lint] +select = ["E", "F", "I"] +ignore = ["E501"] diff --git a/submissions/team_2/chatbot/backend/src/__init__.py b/submissions/team_2/chatbot/backend/src/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/submissions/team_2/chatbot/backend/src/app/__init__.py b/submissions/team_2/chatbot/backend/src/app/__init__.py new file mode 100644 index 0000000..ce359a8 --- /dev/null +++ b/submissions/team_2/chatbot/backend/src/app/__init__.py @@ -0,0 +1,3 @@ +""" +MetaKGP App Package +""" diff --git a/submissions/team_2/chatbot/backend/src/app/main.py b/submissions/team_2/chatbot/backend/src/app/main.py new file mode 100644 index 0000000..8969874 --- /dev/null +++ b/submissions/team_2/chatbot/backend/src/app/main.py @@ -0,0 +1,161 @@ +""" +MetaKGP FastAPI Application +Main application with all service routers +""" + +from fastapi import FastAPI +from fastapi.middleware.cors import CORSMiddleware +from contextlib import asynccontextmanager +import logging +import os +from pathlib import Path +from dotenv import load_dotenv + +from src.services.query_service.router import router as query_router, set_query_service +from src.services.query_service.service import QueryService +from src.services.chat_agent.router import router as got_router, set_got_engine +from src.services.chat_agent.engine import SimplifiedGoTEngine + +# Load environment variables from team_2 root directory +env_path = 
Path(__file__).resolve().parents[4] / '.env' # Go up to team_2/ +if env_path.exists(): + load_dotenv(env_path) + logging.info(f"Loaded .env from {env_path}") +else: + # Fallback to default behavior (search up the directory tree) + load_dotenv() + logging.warning(f".env not found at {env_path}, using default load_dotenv()") + +# Setup logging +logging.basicConfig( + level=logging.INFO, + format='%(asctime)s - %(name)s - %(levelname)s - %(message)s', + force=True +) +logger = logging.getLogger(__name__) + + +@asynccontextmanager +async def lifespan(app: FastAPI): + """ + Application lifespan manager + Handles startup and shutdown events + """ + # Startup + logger.info("Starting MetaKGP API...") + + # Get configuration from environment + modal_url = os.getenv("MODAL_URL") + if not modal_url: + raise RuntimeError("MODAL_URL environment variable not set") + + groq_api_key = os.getenv("GROQ_API_KEY") + if not groq_api_key: + raise RuntimeError("GROQ_API_KEY environment variable not set") + + chroma_dir = os.getenv("CHROMA_DIR", "./chroma_data") + + # Initialize query service + logger.info("Initializing Query Service...") + query_service = QueryService( + modal_url=modal_url, + chroma_dir=chroma_dir, + collection_name="metakgp_wiki" + ) + set_query_service(query_service) + + # Initialize simplified GoT engine with Llama-3.3-70b + logger.info("Initializing Simplified Graph of Thought Engine with Llama-3.3-70b...") + got_engine = SimplifiedGoTEngine( + modal_url=modal_url, + groq_api_key=groq_api_key, + query_api_url="http://localhost:8000/query/search", + top_k=30 # Retrieve 30 chunks for comprehensive analysis + ) + set_got_engine(got_engine) + + logger.info("All services initialized successfully") + logger.info(f"Total documents: {query_service.get_document_count()}") + logger.info(f"GoT Engine: Llama-3.3-70b with top_k={got_engine.top_k}") + + yield + + # Shutdown + logger.info("Shutting down MetaKGP API...") + + +# Create FastAPI app +app = FastAPI( + title="MetaKGP 
Chatbot API", + description="API for MetaKGP WIKI Chatbot", + version="2.0.0", + lifespan=lifespan +) + +# Configure CORS +app.add_middleware( + CORSMiddleware, + allow_origins=["*"], # In production, replace with specific origins like ["http://localhost:5173"] + allow_credentials=True, + allow_methods=["*"], + allow_headers=["*"], +) + + +# Include routers +app.include_router(query_router) +app.include_router(got_router) + + +@app.get("/") +async def root(): + """Root endpoint with API info""" + return { + "name": "MetaKGP Chatbot API", + "version": "2.0.0", + "services": { + "query": { + "description": "Semantic search over MetaKGP wiki", + "endpoints": { + "search": "/query/search (POST)", + "health": "/query/health (GET)" + } + }, + "got": { + "description": "Graph of Thought reasoning service", + "endpoints": { + "query": "/got/query (POST)", + "status": "/got/graph-status (GET)", + "health": "/got/health (GET)" + } + } + }, + "documentation": { + "swagger": "/docs", + "redoc": "/redoc" + } + } + + +@app.get("/health") +async def health(): + """Overall API health check""" + return { + "status": "ok", + "services": ["query"] + } + + +if __name__ == "__main__": + import uvicorn + + port = int(os.getenv("PORT", "8000")) + host = os.getenv("HOST", "0.0.0.0") + + uvicorn.run( + "src.app.main:app", + host=host, + port=port, + log_level="info", + reload=False + ) diff --git a/submissions/team_2/chatbot/backend/src/routers/__init__.py b/submissions/team_2/chatbot/backend/src/routers/__init__.py new file mode 100644 index 0000000..bf6ba02 --- /dev/null +++ b/submissions/team_2/chatbot/backend/src/routers/__init__.py @@ -0,0 +1 @@ +"""Routers package""" diff --git a/submissions/team_2/chatbot/backend/src/routers/health.py b/submissions/team_2/chatbot/backend/src/routers/health.py new file mode 100644 index 0000000..e69de29 diff --git a/submissions/team_2/chatbot/backend/src/services/__init__.py b/submissions/team_2/chatbot/backend/src/services/__init__.py new file mode 
100644 index 0000000..0d989f4 --- /dev/null +++ b/submissions/team_2/chatbot/backend/src/services/__init__.py @@ -0,0 +1 @@ +"""Services package for MetaKGP RAG system""" diff --git a/submissions/team_2/chatbot/backend/src/services/chat_agent/__init__.py b/submissions/team_2/chatbot/backend/src/services/chat_agent/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/submissions/team_2/chatbot/backend/src/services/chat_agent/engine.py b/submissions/team_2/chatbot/backend/src/services/chat_agent/engine.py new file mode 100644 index 0000000..2fa935d --- /dev/null +++ b/submissions/team_2/chatbot/backend/src/services/chat_agent/engine.py @@ -0,0 +1,868 @@ +""" +Simplified Graph of Thought (GoT) Engine with Llama-3.3-70b +Architecture: Query -> Iterative RAG (up to 3 calls) -> GoT on chunks -> Single MoE round -> Final answer +""" + +import logging +import asyncio +from typing import Dict, List, Optional +import json +import httpx + +from src.services.chat_agent.experts import MoEGauntlet, strip_markdown_json +from src.utils.embedding_client import ModalEmbeddingClient +from src.utils.groq_client import GroqClient + +logger = logging.getLogger(__name__) + + +class SimplifiedGoTEngine: + """ + Simplified Graph of Thought reasoning engine with Iterative RAG + Process: Query -> Iterative RAG (max 3 queries) -> Analyze chunks -> Single MoE verification -> Final answer + """ + + def __init__( + self, + modal_url: str, + groq_api_key: str, + query_api_url: str = "http://localhost:8000/query/search", + top_k: int = 30 + ): + """ + Initialize the simplified GoT Engine + + Args: + modal_url: Modal embedding service URL + groq_api_key: Groq API key + query_api_url: URL for the RAG query API + top_k: Number of chunks to retrieve from RAG + """ + self.modal_url = modal_url + self.groq_api_key = groq_api_key + self.query_api_url = query_api_url + self.top_k = top_k + + # Initialize components + self.embedding_client = ModalEmbeddingClient(modal_url) + 
self.groq_client = GroqClient(groq_api_key) + self.moe_gauntlet = MoEGauntlet(self.groq_client) + + # HTTP client for API calls + self.http_client = httpx.AsyncClient(timeout=60.0) + + logger.info(f"SimplifiedGoTEngine initialized with top_k={top_k}") + + async def query_rag(self, query: str, top_k: Optional[int] = None, filters: Optional[Dict] = None) -> Dict: + """ + Query the RAG API to get relevant context with optional metadata filters + + Args: + query: Search query + top_k: Number of results to retrieve (uses self.top_k if not specified) + filters: Optional metadata filters (e.g., {"source_page": "Page Name"}) + + Returns: + Dict with results + """ + try: + k = top_k if top_k is not None else self.top_k + payload = {"query": query, "top_k": k} + + if filters: + payload["filters"] = filters + logger.info(f"Querying RAG: '{query}' with top_k={k}, filters={filters}") + else: + logger.info(f"Querying RAG: '{query}' with top_k={k}") + + response = await self.http_client.post( + self.query_api_url, + json=payload + ) + response.raise_for_status() + return response.json() + + except Exception as e: + logger.error(f"RAG query failed: {e}") + return {"results": [], "error": str(e)} + + async def extract_query_entities(self, query: str) -> Dict: + """ + Extract entities, keywords, and metadata from the query for targeted RAG searches + + Args: + query: User query + + Returns: + Dict with extracted entities, expanded terms, and suggested source pages + """ + prompt = f"""You are a query analyzer for IIT Kharagpur MetaKGP wiki. Extract entities and keywords for targeted search. 
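The entity-extraction prompt below pushes acronym expansion through the LLM. As a rough illustration of the same idea in plain code — not part of this submission, and with an abridged acronym table (the full mapping lives in the engine's prompt text) — a deterministic pre-pass might look like:

```python
import re

# Abridged acronym table (illustrative; the full mapping is in the prompt text)
ACRONYMS = {
    "TFPS": "Technology Film and Photography Society",
    "TLS": "Technology Literary Society",
    "TSG": "Technology Students' Gymkhana",
    "KGP": "IIT Kharagpur",
}

def expand_acronyms(query: str) -> str:
    """Replace whole-word acronyms with full names, longest key first."""
    out = query
    for short in sorted(ACRONYMS, key=len, reverse=True):
        # \b keeps short keys from matching inside unrelated words
        out = re.sub(rf"\b{re.escape(short)}\b", ACRONYMS[short], out,
                     flags=re.IGNORECASE)
    return out

print(expand_acronyms("Who heads TFPS at KGP?"))
# → Who heads Technology Film and Photography Society at IIT Kharagpur?
```

Sorting keys longest-first prevents a short acronym from clobbering part of a longer one before the longer match is attempted.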
+ +User Query: {query} + +ACRONYM DICTIONARY (expand these): +- TFPS → Technology Film and Photography Society +- TLS → Technology Literary Society +- TSG → Technology Students' Gymkhana +- Gymkhana → Technology Students' Gymkhana +- RP / RP Hall → Rajendra Prasad Hall of Residence +- RK / RK Hall → Radha Krishnan Hall of Residence +- Nehru / Nehru Hall → Nehru Hall of Residence +- Azad / Azad Hall → Azad Hall of Residence +- Patel / Patel Hall → Patel Hall of Residence +- MS → Meghnad Saha Hall of Residence +- LLR → Lala Lajpat Rai Hall of Residence +- LBS → Lal Bahadur Shastri Hall of Residence +- MMM → Madan Mohan Malaviya Hall of Residence +- BC Roy → BC Roy Technology Hospital +- SN/IG / SNIG → Sarojini Naidu / Indira Gandhi Hall of Residence +- MT → Mother Teresa Hall of Residence +- SNVH → Sister Nivedita Hall of Residence +- HMC → Hall Management Centre +- VP → Vice President +- GSec → General Secretary +- Sec → Secretary +- SWG → Student Welfare Group +- GC → General Championship +- GC Tech → General Championship (Technology) +- GC SocCult → General Championship (Social and Cultural) +- Inter IIT → Inter IIT Cultural Meet / Inter IIT Sports Meet / Inter IIT Tech Meet +- KGP → Kharagpur / IIT Kharagpur + +TASK: +1. Identify main entities (people, places, organizations, events) +2. Expand any acronyms found +3. Extract key concepts (e.g., "alumni", "events", "members") +4. Suggest likely source page names from the wiki (be specific with full names) +5. 
Generate focused search keywords + +Example: +Query: "alumni of RK hall" +Output: +{{ + "entities": ["Radha Krishnan Hall of Residence"], + "expanded_acronyms": {{"RK hall": "Radha Krishnan Hall of Residence"}}, + "key_concepts": ["alumni", "notable people", "students"], + "suggested_source_pages": ["Radha Krishnan Hall of Residence"], + "focused_keywords": ["alumni", "students", "notable", "residents"] +}} + +CRITICAL: Return ONLY a valid JSON object, nothing else. + +Output format: +{{ + "entities": ["list of main entities with full expanded names"], + "expanded_acronyms": {{"acronym": "full name"}}, + "key_concepts": ["main concepts from query"], + "suggested_source_pages": ["likely wiki page names"], + "focused_keywords": ["specific search terms"] +}}""" + + try: + response = await self.call_llm(prompt, max_tokens=512) + + if not response or not response.strip(): + logger.warning("Empty response from entity extraction") + return self._fallback_entity_extraction(query) + + from src.services.chat_agent.experts import strip_markdown_json + cleaned_response = strip_markdown_json(response) + + if not cleaned_response: + logger.warning("No valid JSON in entity extraction response") + return self._fallback_entity_extraction(query) + + result = json.loads(cleaned_response) + logger.info(f"Extracted entities: {result.get('entities', [])}") + logger.info(f"Suggested source pages: {result.get('suggested_source_pages', [])}") + return result + + except Exception as e: + logger.error(f"Entity extraction failed: {e}") + return self._fallback_entity_extraction(query) + + def _fallback_entity_extraction(self, query: str) -> Dict: + """Fallback entity extraction using simple heuristics""" + # Simple acronym expansion + acronym_map = { + "TFPS": "Technology Film and Photography Society", + "TLS": "Technology Literary Society", + "TSG": "Technology Students' Gymkhana", + "Gymkhana": "Technology Students' Gymkhana", + "RP Hall": "Rajendra Prasad Hall of Residence", + "RP": 
"Rajendra Prasad Hall of Residence", + "RK Hall": "Radha Krishnan Hall of Residence", + "RK": "Radha Krishnan Hall of Residence", + "Nehru Hall": "Nehru Hall of Residence", + "Nehru": "Nehru Hall of Residence", + "Azad Hall": "Azad Hall of Residence", + "Azad": "Azad Hall of Residence", + "Patel Hall": "Patel Hall of Residence", + "Patel": "Patel Hall of Residence", + "MS": "Meghnad Saha Hall of Residence", + "LLR": "Lala Lajpat Rai Hall of Residence", + "LBS": "Lal Bahadur Shastri Hall of Residence", + "MMM": "Madan Mohan Malaviya Hall of Residence", + "BC Roy": "BC Roy Technology Hospital", + "SNIG": "Sarojini Naidu / Indira Gandhi Hall of Residence", + "SN/IG": "Sarojini Naidu / Indira Gandhi Hall of Residence", + "MT": "Mother Teresa Hall of Residence", + "SNVH": "Sister Nivedita Hall of Residence", + "HMC": "Hall Management Centre", + "VP": "Vice President", + "GSec": "General Secretary", + "Sec": "Secretary", + "SWG": "Student Welfare Group", + "GC": "General Championship", + "GC Tech": "General Championship (Technology)", + "GC SocCult": "General Championship (Social and Cultural)", + "Inter IIT": "Inter IIT Meet", + "KGP": "IIT Kharagpur" + } + + import re # local import: this module has no module-level re import + + expanded_acronyms = {} + entities = [] + + for acronym, full_name in acronym_map.items(): + # Whole-word match so short acronyms like "MS" or "GC" cannot fire inside unrelated words (e.g. "programs") + if re.search(rf"\b{re.escape(acronym.lower())}\b", query.lower()): + expanded_acronyms[acronym] = full_name + entities.append(full_name) + + # Extract simple keywords + stop_words = {'the', 'a', 'an', 'is', 'are', 'was', 'were', 'of', 'in', 'at', 'to', 'for', 'on', 'with', 'by', 'from'} + keywords = [word.strip('?.,!') for word in query.lower().split() if word not in stop_words and len(word) > 2] + + return { + "entities": entities, + "expanded_acronyms": expanded_acronyms, + "key_concepts": keywords[:3], + "suggested_source_pages": entities, + "focused_keywords": keywords[:5] + } + + async def generate_targeted_rag_queries(self, query: str, entity_info: Dict) -> List[Dict]: + """ + Generate multiple targeted RAG queries with different strategies + + 
Args: + query: Original user query + entity_info: Extracted entity information + + Returns: + List of query configs: [{"query": str, "top_k": int, "filters": dict, "strategy": str}] + """ + queries = [] + + # Strategy 1: Filtered search on specific source pages with key concepts + suggested_pages = entity_info.get("suggested_source_pages", []) + key_concepts = entity_info.get("key_concepts", []) + + for page in suggested_pages[:2]: # Top 2 suggested pages + for concept in key_concepts[:2]: # Top 2 concepts + queries.append({ + "query": concept, + "top_k": 10, + "filters": {"source_page": page}, + "strategy": f"Filtered: {concept} in {page}" + }) + + # Strategy 2: Broad search with full entity names + entities = entity_info.get("entities", []) + for entity in entities[:2]: + queries.append({ + "query": entity, + "top_k": 15, + "filters": None, + "strategy": f"Broad entity: {entity}" + }) + + # Strategy 3: Original query (unfiltered) + queries.append({ + "query": query, + "top_k": 20, + "filters": None, + "strategy": "Original query" + }) + + # Strategy 4: Focused keywords without filters (discovery mode) + focused_keywords = entity_info.get("focused_keywords", []) + if focused_keywords: + keyword_query = " ".join(focused_keywords[:3]) + queries.append({ + "query": keyword_query, + "top_k": 15, + "filters": None, + "strategy": f"Keywords: {keyword_query}" + }) + + logger.info(f"Generated {len(queries)} targeted RAG queries with different strategies") + return queries + + async def call_llm(self, prompt: str, max_tokens: int = 3072) -> str: + """ + Call Llama-3.3-70b model + + Args: + prompt: Prompt text + max_tokens: Maximum tokens + + Returns: + LLM response + """ + try: + return await self.groq_client.generate_judge(prompt, max_tokens=max_tokens) + except Exception as e: + logger.error(f"LLM call failed: {e}") + return "" + + async def check_relevance(self, query: str) -> Dict: + """ + Check if the query is related to IIT Kharagpur/MetaKGP + Also expands common 
acronyms before processing + + Returns: + { + "is_relevant": bool, + "expanded_query": str, + "reasoning": str + } + """ + prompt = f"""You are a relevance checker and Query Planner for MetaKGP Wiki (IIT Kharagpur information system). + +Question: {query} + +STEP 1: ACRONYM EXPANSION +The database does NOT understand acronyms. You MUST expand them using this dictionary: +- TFPS -> Technology Film and Photography Society +- TLS -> Technology Literary Society +- TSG -> Technology Students' Gymkhana +- Gymkhana -> Technology Students' Gymkhana +- RP / RP Hall -> Rajendra Prasad Hall of Residence +- RK / RK Hall -> Radha Krishnan Hall of Residence +- Nehru / Nehru Hall -> Nehru Hall of Residence +- Azad / Azad Hall -> Azad Hall of Residence +- Patel / Patel Hall -> Patel Hall of Residence +- MS -> Meghnad Saha Hall of Residence +- LLR -> Lala Lajpat Rai Hall of Residence +- LBS -> Lal Bahadur Shastri Hall of Residence +- MMM -> Madan Mohan Malaviya Hall of Residence +- BC Roy -> BC Roy Technology Hospital +- SN/IG / SNIG -> Sarojini Naidu / Indira Gandhi Hall of Residence +- MT -> Mother Teresa Hall of Residence +- SNVH -> Sister Nivedita Hall of Residence +- HMC -> Hall Management Centre +- VP -> Vice President +- GSec -> General Secretary +- Sec -> Secretary +- SWG -> Student Welfare Group +- GC -> General Championship +- GC Tech -> General Championship (Technology) +- GC SocCult -> General Championship (Social and Cultural) +- Inter IIT -> Inter IIT Cultural Meet / Inter IIT Sports Meet / Inter IIT Tech Meet +- KGP -> Kharagpur / IIT Kharagpur + + +STEP 2: RELEVANCE CHECK +Determine if this question could be related to IIT Kharagpur. 
Be LENIENT: +- Academic terms (GPA, CGPA, grades like "2.2", courses, departments) +- Campus life, facilities, hostels, societies, events +- Abbreviations or short queries that might refer to IIT KGP specific things +- Technical terms that could relate to academics or campus activities + +Only mark as irrelevant if it's CLEARLY about something else (e.g., "weather in New York", "recipe for pasta", "movie recommendations"). + +When in doubt, mark as RELEVANT. Short or ambiguous queries should be marked RELEVANT. + +CRITICAL: Return ONLY a valid JSON object, nothing else. No explanations, no markdown formatting. + +Output format: +{{ + "is_relevant": true/false, + "expanded_query": "query with acronyms expanded", + "reasoning": "brief explanation" +}}""" + + response = await self.call_llm(prompt, max_tokens=256) + + try: + cleaned_response = strip_markdown_json(response) + result = json.loads(cleaned_response) + return result + except Exception: + # Default to relevant if parsing fails; a bare except here would also swallow SystemExit/KeyboardInterrupt + return {"is_relevant": True, "reasoning": "Parse error, proceeding"} + + async def generate_followup_queries(self, original_query: str, initial_chunks: List[Dict]) -> List[str]: + """ + Generate follow-up queries based on initial RAG results + Uses multi-path reasoning approach to create diverse queries + + Args: + original_query: The original user query + initial_chunks: Chunks retrieved from the first RAG call + + Returns: + List of follow-up queries (max 2) + """ + # Analyze initial chunks to understand what information we got + chunk_summaries = [] + for i, chunk in enumerate(initial_chunks[:5], 1): + chunk_summaries.append(f"{i}. {chunk['metadata']['title']}: {chunk['text'][:150]}...") + + chunks_preview = "\n".join(chunk_summaries) + + prompt = f"""You are analyzing search results to generate follow-up queries for deeper information gathering using multi-path reasoning. 
+ +Original Query: {original_query} + +Initial Retrieved Information (top 5 chunks): +{chunks_preview} + +TASK: Generate 1-2 follow-up queries using DIFFERENT REASONING PATHS: + +PATH 1 - Direct Expansion: +- If the original query is about events/activities, query for specific event names or details mentioned +- If it's about a society/club, query for specific initiatives, projects, or achievements + +PATH 2 - Temporal Context: +- If asking about current/recent information, focus on 2024-2025 timeframe +- Query for historical context if relevant + +PATH 3 - Related Entities: +- Extract key entities (people, places, organizations) from chunks +- Generate queries about relationships between these entities + +Guidelines: +1. Make queries MORE SPECIFIC than the original +2. Use information from the chunks to create targeted queries +3. Each follow-up should explore a DIFFERENT angle (temporal, specific details, relationships) +4. If chunks already provide comprehensive info, return empty list [] +5. Example transformations: + - "events by SWG" → Path 1: "SWG Hacknight details", Path 2: "SWG workshops 2025" + - "Who is VP of TFPS?" → Path 1: "Vice President Technology Film and Photography Society", Path 2: "TFPS leadership team 2025" + +CRITICAL: Return ONLY a valid JSON object, nothing else. No explanations, no markdown formatting. 
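Prompts like the one above demand raw JSON, but models frequently wrap replies in markdown fences or prepend chatter. A minimal defensive parser in the spirit of this diff's `strip_markdown_json` helper — a simplified stand-in for illustration, not the submission's exact implementation — could look like:

```python
import json
import re

def parse_llm_json(raw: str):
    """Best-effort extraction of a JSON value from an LLM reply."""
    text = raw.strip()
    # Prefer content inside ``` or ```json fences, if any
    fenced = re.findall(r"```(?:json)?\s*\n?(.*?)\n?```", text, re.DOTALL)
    if fenced:
        text = fenced[-1].strip()
    # Scan for the first position where a valid JSON value can be decoded
    decoder = json.JSONDecoder()
    for i, ch in enumerate(text):
        if ch in "{[":
            try:
                obj, _ = decoder.raw_decode(text[i:])
                return obj
            except json.JSONDecodeError:
                continue
    raise ValueError("no JSON object found in LLM response")

reply = 'Sure! Here you go:\n```json\n{"followup_queries": ["SWG Hacknight details"]}\n```'
print(parse_llm_json(reply)["followup_queries"])  # → ['SWG Hacknight details']
```

`json.JSONDecoder.raw_decode` is the key trick: it decodes a value starting at a given offset and ignores trailing text, so stray prose after the JSON does not break parsing.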
+ +Output format: +{{ + "followup_queries": ["query 1", "query 2"], + "reasoning_paths": ["path type for query 1", "path type for query 2"], + "reasoning": "why these queries help" +}}""" + + try: + response = await self.call_llm(prompt, max_tokens=512) + + # Check if response is empty + if not response or not response.strip(): + logger.warning("Empty response from LLM for follow-up query generation") + return [] + + # Strip markdown and extract JSON + cleaned_response = strip_markdown_json(response) + + if not cleaned_response: + logger.warning("No valid JSON found in follow-up query response") + return [] + + result = json.loads(cleaned_response) + queries = result.get("followup_queries", []) + + # Validate queries are strings + if not isinstance(queries, list): + logger.warning(f"Invalid followup_queries format: {type(queries)}") + return [] + + # Filter to only string queries + valid_queries = [q for q in queries if isinstance(q, str) and q.strip()] + + logger.info(f"Generated {len(valid_queries)} follow-up queries: {valid_queries}") + return valid_queries[:2] # Max 2 follow-up queries + + except json.JSONDecodeError as e: + logger.error(f"Failed to parse follow-up queries JSON: {e}") + logger.error(f"Raw response: {response[:500]}") + return [] + except Exception as e: + logger.error(f"Failed to generate follow-up queries: {e}") + return [] + + async def iterative_rag_retrieval(self, query: str, max_iterations: int = 3) -> Dict: + """ + Perform iterative RAG retrieval with intelligent multi-strategy queries + Now uses entity extraction and metadata filtering for targeted searches + + Args: + query: Original user query + max_iterations: Maximum number of RAG query strategies to execute (default 3) + + Returns: + Dict with aggregated results from all iterations + """ + all_chunks = [] + seen_texts = set() # Avoid duplicates + queries_made = [] + + # Step 1: Extract entities and generate targeted queries + logger.info(f"Extracting entities from query: {query}") + 
entity_info = await self.extract_query_entities(query) + + # Step 2: Generate multiple targeted RAG queries with different strategies + targeted_queries = await self.generate_targeted_rag_queries(query, entity_info) + + # Step 3: Execute targeted queries (up to 2x max_iterations strategies; early stopping below caps the total) + for i, query_config in enumerate(targeted_queries[:max_iterations * 2], start=1): + strategy = query_config["strategy"] + rag_query = query_config["query"] + top_k = query_config["top_k"] + filters = query_config.get("filters") + + logger.info(f"Iteration {i}: Strategy: {strategy}") + + queries_made.append({ + "query": rag_query, + "strategy": strategy, + "filters": filters + }) + + # Execute RAG query with filters + rag_results = await self.query_rag(rag_query, top_k=top_k, filters=filters) + + if rag_results.get("results"): + new_chunks_count = 0 + for chunk in rag_results["results"]: + chunk_text = chunk["text"] + if chunk_text not in seen_texts: + # Tag chunk with the strategy that found it + chunk["discovery_strategy"] = strategy + all_chunks.append(chunk) + seen_texts.add(chunk_text) + new_chunks_count += 1 + + logger.info(f"Iteration {i}: Added {new_chunks_count} new unique chunks (Total: {len(all_chunks)})") + else: + logger.info(f"Iteration {i}: No results from this strategy") + + # Early stopping if we have enough diverse chunks + if len(all_chunks) >= 50: + logger.info(f"Early stopping: Collected {len(all_chunks)} chunks (target reached)") + break + + # Sort all chunks by score (highest first) + all_chunks.sort(key=lambda x: x["score"], reverse=True) + + logger.info(f"Multi-strategy RAG complete: {len(queries_made)} queries executed, {len(all_chunks)} total unique chunks") + + return { + "results": all_chunks, + "queries_made": queries_made, + "total_chunks": len(all_chunks), + "entity_info": entity_info, + "error": None + } + + async def analyze_chunks(self, query: str, chunks: List[Dict]) -> str: + """ + Analyze retrieved chunks using Graph of Thought reasoning with multiple 
reasoning paths + + Args: + query: Original query + chunks: List of retrieved chunks from RAG + + Returns: + Analyzed thought/summary from chunks with multiple reasoning paths explored + """ + # Use top 20 chunks since we now have more diverse data from iterative retrieval + # This gives us ~10k tokens, still within limits + chunks_to_analyze = chunks[:20] + + # Format chunks with source information + chunks_text = "" + for i, chunk in enumerate(chunks_to_analyze, 1): + chunks_text += f"\n--- Chunk {i} (Score: {chunk['score']:.3f}) ---\n" + chunks_text += f"Title: {chunk['metadata']['title']}\n" + chunks_text += f"Source: {chunk['metadata']['source_page']}\n" + chunks_text += f"Text: {chunk['text']}\n" + + prompt = f"""You are analyzing information from the MetaKGP wiki using Graph of Thought (GoT) reasoning with multiple paths. + +Sub-Question: {query} + +Retrieved Context (top 20 most relevant chunks from iterative search): +{chunks_text} + +MULTI-PATH REASONING APPROACH: + +PATH 1 - Direct Answer from Primary Context: +- Based ONLY on the highest-scoring chunks (1-3), answer the sub-question directly +- Be specific and cite what the source says +- Format: "According to [Source], [fact]" + +PATH 2 - Synthesized Answer from Multiple Contexts: +- Combine information from multiple chunks (1-10) +- Mention which sources support each claim +- Look for consensus across different sources +- Format: "Multiple sources indicate: [fact] (Sources: [list])" + +PATH 3 - Temporal-Aware Answer (if applicable): +- If asking about current/recent information (2024-2025), extract ONLY claims marked as current +- Flag any outdated information clearly +- Prioritize most recent data +- Format: "As of 2025, [fact] (Source: [name])" + +INSTRUCTIONS: +1. Generate answers using ALL THREE PATHS where applicable +2. For each path, extract specific facts, names, dates, numbers, and details +3. Clearly label which path each piece of information comes from +4. 
If a path is not applicable (e.g., no temporal data needed), skip it +5. If information is insufficient for any path, state it clearly +6. Prioritize chunks with higher scores as they are more relevant +7. ALWAYS cite your sources using the format: (Source: [title/page]) + +Provide a comprehensive multi-path analysis:""" + + logger.info(f"Analyzing top {len(chunks_to_analyze)} chunks with LLM") + return await self.call_llm(prompt, max_tokens=2048) + + async def generate_final_answer(self, query: str, analysis: str, verification_result: Dict) -> str: + """ + Generate the final answer based on multi-path analysis and expert verification + + Args: + query: Original query + analysis: Multi-path analysis from chunks + verification_result: MoE verification result + + Returns: + Final answer with source citations + """ + verification_remarks = verification_result.get("remarks", "") + passed = verification_result.get("passed", False) + final_score = verification_result.get("final_score", 0.0) + + # Extract expert verdicts for transparency + expert_results = verification_result.get("expert_results", {}) + source_matcher_verdict = expert_results.get("source_matcher", {}).get("passed", False) + hallucination_verdict = expert_results.get("hallucination_hunter", {}).get("passed", False) + logic_verdict = expert_results.get("logic_expert", {}).get("passed", False) + + prompt = f"""You are synthesizing verified information from the MetaKGP wiki (IIT Kharagpur) using Graph of Thought reasoning. 
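`generate_final_answer` above consumes a `verification_result` carrying per-expert verdicts and a `final_score`. One plausible way such a score could be aggregated — the weights and the 0.7 threshold here are illustrative assumptions; the real logic lives in `MoEGauntlet` in experts.py and may differ — is a weighted average with pass gates:

```python
# Hypothetical aggregation of the three expert scores. The weights and the
# 0.7 pass threshold are illustrative assumptions, not values from the diff.
WEIGHTS = {"source_matcher": 0.4, "hallucination_hunter": 0.4, "logic_expert": 0.2}

def aggregate(expert_scores: dict) -> dict:
    """Combine per-expert scores (0-1) into one verdict dict."""
    final = sum(WEIGHTS[name] * expert_scores[name] for name in WEIGHTS)
    return {
        "final_score": round(final, 3),
        "passed": final >= 0.7,                             # overall gate
        "expert_results": {
            name: {"score": score, "passed": score >= 0.5}  # per-expert gate
            for name, score in expert_scores.items()
        },
    }

verdict = aggregate({"source_matcher": 0.9, "hallucination_hunter": 0.8, "logic_expert": 0.6})
print(verdict["final_score"], verdict["passed"])  # → 0.8 True
```

Weighting the grounding experts (source matching, hallucination detection) above pure logic reflects the challenge's emphasis on answering only from scraped data.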
+ +Original Question: {query} + +Multi-Path Analysis from Wiki: +{analysis} + +Expert Verification Status (MoE Gauntlet): +- Overall Verdict: {"✓ VERIFIED" if passed else "⚠ NEEDS REVIEW"} +- Final Confidence Score: {final_score:.2f}/1.0 +- Source Matcher: {"✓ PASS" if source_matcher_verdict else "✗ FAIL"} +- Hallucination Hunter: {"✓ PASS (No hallucinations)" if hallucination_verdict else "✗ FAIL (Hallucinations detected)"} +- Logic Expert: {"✓ PASS" if logic_verdict else "✗ FAIL"} +- Verification Notes: {verification_remarks} + +SYNTHESIS INSTRUCTIONS: +1. Construct a SHORT, DIRECT answer using ONLY verified facts from the analysis +2. If multiple reasoning paths provided the same information, mention the consensus +3. Maintain source citations throughout in format: (Source: [page/title]) +4. Prioritize information that passed expert verification +5. If verification flagged issues, be cautious and qualify your statements +6. ONLY include information that directly answers the question asked +7. If information is not available, state this clearly and concisely - do NOT provide tangentially related information +8. Avoid unnecessary background or historical context unless specifically asked +9. Be conversational but concise (2-3 sentences for simple queries) +10. Do not make up information not present in the analysis + +Example formats: +- "According to the MetaKGP wiki, [fact] (Source: [page]). This is confirmed by [another source]." +- "Multiple sources indicate that [fact] (Sources: [page1], [page2])." +- "As of 2025, [current fact] (Source: [page])." +- "I don't have enough information to answer this question. The MetaKGP wiki does not contain details about [topic]." + +Final Answer:""" + + logger.info("Generating final answer") + return await self.call_llm(prompt, max_tokens=2048) + + async def process_query(self, query: str) -> Dict: + """ + Process a query through the simplified pipeline with iterative RAG + + Pipeline: + 1. 
Iterative RAG retrieval (up to 3 queries for comprehensive data gathering) + 2. Check if query is relevant based on retrieved chunks + 3. Analyze chunks with Graph of Thought reasoning + 4. Run single MoE verification round + 5. Generate final answer + + Args: + query: User query + + Returns: + Dict with final answer and metadata + """ + logger.info(f"Processing query: {query}") + + try: + # Step 1: Query RAG with iterative retrieval first + logger.info("Step 1: Querying RAG with iterative retrieval (max 3 iterations)") + rag_results = await self.iterative_rag_retrieval(query, max_iterations=3) + + if not rag_results.get("results"): + logger.warning("No chunks retrieved from RAG") + # Only check relevance if we got no chunks + relevance = await self.check_relevance(query) + + if not relevance.get("is_relevant", True): + logger.info(f"Query not relevant: {relevance.get('reasoning')}") + return { + "query": query, + "answer": "This question is not related to IIT Kharagpur or MetaKGP. I can only answer questions about IIT Kharagpur campus, academics, facilities, events, societies, and related information from the MetaKGP wiki.", + "confidence": 0.0, + "chunks_retrieved": 0, + "verification_passed": False, + "reasoning": relevance.get("reasoning"), + "error": None + } + + # If relevant but no chunks found + return { + "query": query, + "answer": "I don't have enough information in the MetaKGP wiki to answer this question. 
The wiki may not contain details about this topic yet.", + "confidence": 0.0, + "chunks_retrieved": 0, + "verification_passed": False, + "reasoning": "No relevant chunks found", + "error": rag_results.get("error") + } + + # If we got chunks, skip relevance check (chunks prove relevance) + chunks = rag_results["results"] + queries_made = rag_results.get("queries_made", []) + entity_info = rag_results.get("entity_info", {}) + + logger.info(f"Retrieved {len(chunks)} unique chunks from {len(queries_made)} multi-strategy queries") + + # Log entity extraction results + if entity_info: + logger.info(f"Detected entities: {entity_info.get('entities', [])}") + logger.info(f"Expanded acronyms: {entity_info.get('expanded_acronyms', {})}") + + # Format context for MoE + context_text = "\n\n".join([ + f"[{i+1}] {chunk['text']}" + for i, chunk in enumerate(chunks[:10]) # Use top 10 for context + ]) + + # Step 2: Analyze chunks with GoT reasoning + logger.info("Step 2: Analyzing chunks") + analysis = await self.analyze_chunks(query, chunks) + + if not analysis: + logger.error("Failed to analyze chunks") + return { + "query": query, + "answer": "I encountered an error while analyzing the information. 
Please try again.", + "confidence": 0.0, + "chunks_retrieved": len(chunks), + "verification_passed": False, + "reasoning": "Analysis failed", + "error": "LLM analysis failed" + } + + # Step 3: Single MoE verification round + logger.info("Step 3: Running MoE verification") + verification = await self.moe_gauntlet.verify_thought( + thought=analysis, + context=context_text, + graph_history=[] # No graph history in simplified version + ) + + logger.info(f"Verification result: {verification['remarks']}") + + # Step 4: Generate final answer + logger.info("Step 4: Generating final answer") + final_answer = await self.generate_final_answer(query, analysis, verification) + + if not final_answer: + logger.error("Failed to generate final answer") + return { + "query": query, + "answer": "I encountered an error while generating the answer. Please try again.", + "confidence": 0.0, + "chunks_retrieved": len(chunks), + "verification_passed": verification["passed"], + "reasoning": verification["remarks"], + "error": "Final answer generation failed" + } + + # Return result + confidence = verification["final_score"] * 0.9 # Scale down slightly + + # Extract query keywords (remove common words) + query_lower = query.lower() + common_words = {'the', 'a', 'an', 'is', 'are', 'was', 'were', 'of', 'in', 'at', 'to', 'for', 'on', 'with', 'by', 'from', 'what', 'who', 'where', 'when', 'how', 'why', 'which'} + query_keywords = [word for word in query_lower.split() if word not in common_words and len(word) > 2] + + # Filter chunks that have query keywords in source_page or title + relevant_chunks = [] + for chunk in chunks: + source_page = chunk["metadata"]["source_page"].lower() + title = chunk["metadata"]["title"].lower() + + # Check if any query keyword appears in source_page or title + has_keyword = any(keyword in source_page or keyword in title for keyword in query_keywords) + + if has_keyword: + relevant_chunks.append(chunk) + + # If we have relevant chunks, use them; otherwise fall back 
to top chunks + chunks_for_sources = relevant_chunks if relevant_chunks else chunks + + # Extract unique sources, prioritizing those with query keywords + unique_sources = [] + seen_sources = set() + for chunk in chunks_for_sources[:15]: # Check top 15 relevant chunks + source = chunk["metadata"]["source_page"] + if source not in seen_sources and len(unique_sources) < 5: + unique_sources.append({ + "page": source, + "score": chunk["score"] + }) + seen_sources.add(source) + + # Format sources as list of page names + source_pages = [s["page"] for s in unique_sources] + + # Extract query strategies used (for debugging) + strategies_used = list(set([q.get("strategy", "Unknown") for q in queries_made])) + + return { + "query": query, + "answer": final_answer, + "confidence": confidence, + "chunks_retrieved": len(chunks), + "queries_made": queries_made, + "strategies_used": strategies_used, + "entity_info": entity_info, + "verification_passed": verification["passed"], + "verification_score": verification["final_score"], + "reasoning": verification["remarks"], + "sources": source_pages, + "error": None + } + + except Exception as e: + logger.error(f"Query processing failed: {e}", exc_info=True) + return { + "query": query, + "answer": "I encountered an error while processing your question. 
Please try again.", + "confidence": 0.0, + "chunks_retrieved": 0, + "verification_passed": False, + "reasoning": str(e), + "error": str(e) + } + + async def close(self): + """Close HTTP client""" + await self.http_client.aclose() diff --git a/submissions/team_2/chatbot/backend/src/services/chat_agent/experts.py b/submissions/team_2/chatbot/backend/src/services/chat_agent/experts.py new file mode 100644 index 0000000..842f642 --- /dev/null +++ b/submissions/team_2/chatbot/backend/src/services/chat_agent/experts.py @@ -0,0 +1,585 @@ +""" +Mixture of Experts (MoE) for Graph of Thought Verification +Three specialized experts: Hallucination Hunter, Source Matcher, Logic Expert +Uses Groq models +""" + +import logging +from typing import Dict, List, Tuple +import asyncio +import re + +from src.utils.groq_client import GroqClient + +logger = logging.getLogger(__name__) + + +def strip_markdown_json(text: str) -> str: + """Remove markdown code fences from LLM responses and extract JSON. + + Handles formats like: + ```json + {...} + ``` + or + ``` + {...} + ``` + Also handles cases where there's extra text before or after the JSON. + """ + if not text: + return text + + text = text.strip() + + # First, try to find JSON within code fences + # Pattern: ```json ... ``` or ``` ... 
``` + code_fence_pattern = r'```(?:json)?\s*\n?(.*?)\n?```' + matches = re.findall(code_fence_pattern, text, re.DOTALL) + if matches: + # Use the last match (in case there are multiple) + text = matches[-1].strip() + + # Now try to extract just the JSON object/array + # Find the first { or [ and match to its closing bracket + import json + for i, char in enumerate(text): + if char in '{[': + # Try to parse from this position + try: + # Use JSONDecoder to find where valid JSON ends + decoder = json.JSONDecoder() + obj, end_idx = decoder.raw_decode(text[i:]) + # Return the valid JSON string + return text[i:i+end_idx] + except json.JSONDecodeError: + continue + + return text.strip() + + +class Expert: + """Base class for MoE experts""" + + def __init__(self, groq_client: GroqClient): + """ + Initialize expert + + Args: + groq_client: Groq client for LLM calls + """ + self.groq_client = groq_client + + async def call_llm(self, prompt: str) -> str: + """ + Call Groq Expert model with the given prompt + + Args: + prompt: Prompt text + + Returns: + LLM response text (empty string on error) + """ + try: + response = await self.groq_client.generate_expert(prompt, max_tokens=512) + return response if response else "" + except Exception as e: + logger.error(f"LLM call failed in {self.__class__.__name__}: {e}") + return "" + + async def verify(self, thought: str, context: str, graph_history: List[Dict]) -> Dict: + """ + Verify a thought (to be implemented by subclasses) + + Args: + thought: The thought to verify + context: Retrieved context/chunks + graph_history: Last 3 nodes from the graph + + Returns: + Dict with score and remarks + """ + raise NotImplementedError + + +class HallucinationHunter(Expert): + """ + Expert that detects hallucinations by comparing thought against context + """ + + async def verify(self, thought: str, context: str, graph_history: List[Dict]) -> Dict: + """ + Check if the thought contains any information not present in the context + Uses STRICT 
verification - only explicit facts from context are allowed
+
+        Returns:
+            {
+                "score": float (0-1),
+                "passed": bool,
+                "remarks": str
+            }
+        """
+        prompt = f"""You are a Hallucination Hunter for MetaKGP wiki verification. Your job is to detect if the bot is making up details.
+
+Context from MetaKGP wiki:
+{context}
+
+Thought to Verify:
+{thought}
+
+STRICT ANALYSIS PROTOCOL:
+1. What specific claims does the Thought make?
+2. Which of these claims are ACTUALLY present in the Context?
+3. Which claims appear to be INVENTED or INFERRED (not in the context)?
+
+Be STRICT: If something isn't explicitly in the context, it's hallucination.
+
+Examples:
+✓ PASS: Context says "John is President of TFPS" → Thought: "John leads TFPS" (reasonable paraphrase)
+✗ FAIL: Context says "John is President of TFPS" → Thought: "John has been President since 2020" (invented date)
+✓ PASS: Thought says "Information not available" or "Insufficient information" (acknowledging limitation)
+✗ FAIL: Context has no mention of topic X → Thought makes specific claims about X (pure hallucination)
+
+CRITICAL: Return ONLY a valid JSON object, nothing else. No explanations, no markdown formatting.
+
+Output format:
+{{
+    "hallucinations_found": ["list specific details that are NOT in the context"],
+    "confidence": 0.0-1.0,
+    "verdict": "PASS" or "FAIL"
+}}
+
+Guidelines for confidence:
+- 0.9-1.0: Definitely hallucinating, multiple invented details
+- 0.7-0.8: Likely hallucinating, some unsupported claims
+- 0.5-0.6: Borderline, minor unsupported inferences
+- 0.3-0.4: Mostly grounded, reasonable interpretations
+- 0.0-0.2: Fully grounded in context"""
+
+        response = await self.call_llm(prompt)
+
+        # Log the raw response for debugging
+        if not response or not response.strip():
+            logger.error("HallucinationHunter received empty response from LLM")
+            return {
+                "score": 0.6,
+                "passed": True,
+                "remarks": "LLM returned empty response, defaulting to PASS",
+                "expert": "HallucinationHunter"
+            }
+
+        try:
+            # Strip markdown code fences before parsing
+            import json
+            cleaned_response = strip_markdown_json(response)
+            result = json.loads(cleaned_response.strip())
+
+            hallucinations = result.get("hallucinations_found", [])
+            confidence = float(result.get("confidence", 0.6))
+            verdict = result.get("verdict", "PASS")  # Default to PASS
+
+            # Per the prompt, "confidence" is the hallucination likelihood (low = grounded)
+            passed = verdict == "PASS" or (confidence <= 0.4 and len(hallucinations) <= 1)
+            score = 1.0 - confidence  # groundedness = inverse of hallucination confidence
+
+            remarks = f"Hallucinations: {', '.join(hallucinations)}" if hallucinations else "No hallucinations detected"
+
+            return {
+                "score": score,
+                "passed": passed,
+                "remarks": remarks,
+                "expert": "HallucinationHunter"
+            }
+
+        except Exception as e:
+            logger.error(f"Failed to parse HallucinationHunter response: {e}")
+            logger.error(f"Raw response was: {response}")
+            logger.error(f"Cleaned response was: {cleaned_response}")
+            return {
+                "score": 0.3,
+                "passed": False,
+                "remarks": f"Parse error: {e}",
+                "expert": "HallucinationHunter"
+            }
+
+
+class SourceMatcher(Expert):
+    """
+    Expert that verifies the thought is semantically contained in the context
+    """
+
+    async def verify(self, thought: str, context: str, graph_history: List[Dict]) -> Dict:
+        """
+        Check if the meaning of the thought is directly supported by the retrieved chunks
+        Uses STRICT verification - requires explicit support in context
+
+        Returns:
+            {
+                "score": float (0-1),
+                "passed": bool,
+                "remarks": str
+            }
+        """
+        prompt = f"""You are a Source Matcher expert. Your job is to verify if a claim is directly supported by source text.
+
+Context from MetaKGP wiki:
+{context}
+
+Thought to Verify:
+{thought}
+
+STRICT VERIFICATION PROTOCOL:
+Question: Does the SOURCE TEXT explicitly contain information that directly supports this Thought?
+
+Analysis Steps:
+1. Identify the key claims in the Thought
+2. For each claim, search the Context for explicit supporting evidence
+3. A claim is supported ONLY if the Context contains the same or equivalent information
+4. Paraphrasing is acceptable, but inferences beyond what's stated are NOT
+
+Examples:
+✓ PASS: Context: "TFPS conducts photography workshops" → Thought: "Technology Film and Photography Society organizes photography workshops"
+✓ PASS: Context: "John is VP of TSG" → Thought: "John holds the Vice President position at Technology Students' Gymkhana"
+✗ FAIL: Context: "TFPS has 50 members" → Thought: "TFPS is the largest society" (unsupported comparison)
+✗ FAIL: Context mentions "workshop" → Thought: "weekly workshop series" (added frequency not in context)
+
+Be STRICT: The source must actually contain the claim, not just related information.
+
+CRITICAL: Return ONLY a valid JSON object, nothing else. No explanations, no markdown formatting.
+ +Output format: +{{ + "verdict": "YES" or "NO", + "confidence": 0.0 to 1.0, + "reasoning": "Brief explanation of which claims are/aren't supported" +}} + +Confidence Guidelines: +- 1.0: All claims explicitly in context +- 0.8-0.9: Most claims supported, minor paraphrasing +- 0.6-0.7: Some claims supported, some inferred +- 0.4-0.5: Few claims supported, mostly inferred +- 0.0-0.3: Claims not in context or contradicted""" + + response = await self.call_llm(prompt) + + # Log the raw response for debugging + if not response or not response.strip(): + logger.error(f"SourceMatcher received empty response from LLM") + return { + "score": 0.6, + "passed": True, + "remarks": "LLM returned empty response, defaulting to PASS", + "expert": "SourceMatcher" + } + + try: + import json + # Strip markdown code fences before parsing + cleaned_response = strip_markdown_json(response) + result = json.loads(cleaned_response.strip()) + + verdict = result.get("verdict", "NO") + confidence = float(result.get("confidence", 0.5)) + reasoning = result.get("reasoning", "") + + # Strict: only pass if verdict is YES AND confidence >= 0.6 + passed = verdict == "YES" and confidence >= 0.6 + score = confidence if passed else confidence * 0.5 # Penalize failures + + return { + "score": score, + "passed": passed, + "remarks": reasoning, + "expert": "SourceMatcher" + } + + except Exception as e: + logger.error(f"Failed to parse SourceMatcher response: {e}") + logger.error(f"Raw response was: {response}") + logger.error(f"Cleaned response was: {cleaned_response}") + return { + "score": 0.3, + "passed": False, + "remarks": f"Parse error: {e}", + "expert": "SourceMatcher" + } + + +class LogicExpert(Expert): + """ + Expert that ensures the reasoning chain makes sense + """ + + async def verify(self, thought: str, context: str, graph_history: List[Dict]) -> Dict: + """ + Check if the thought follows logically from the premises in the context + Uses formal logic principles to verify reasoning + + 
Returns:
+            {
+                "score": float (0-1),
+                "passed": bool,
+                "remarks": str,
+                "action": "keep" | "merge" | "discard"
+            }
+        """
+        # Format graph history
+        history_text = ""
+        for i, node in enumerate(graph_history[-3:], 1):
+            history_text += f"{i}. {node.get('thought', '')}\n"
+
+        prompt = f"""You are a Logic Expert. Your job is to ensure logical consistency and verify that conclusions follow from premises.
+
+Context (Premises):
+{context}
+
+Thought (Conclusion to Verify):
+{thought}
+
+Previous Reasoning Steps:
+{history_text if history_text else "This is the first node."}
+
+FORMAL LOGIC ANALYSIS:
+
+Step 1: Extract Premises
+- Identify the key facts/premises stated in the Context
+
+Step 2: Extract Conclusion
+- Identify the conclusion/claim made in the Thought
+
+Step 3: Verify Logical Flow
+- Does the conclusion logically follow from the premises?
+- Are there any logical fallacies or unsupported jumps in reasoning?
+
+Examples:
+✓ LOGICAL:
+  Premises: "John is taller than Mary. Mary is taller than Sam."
+  Conclusion: "John is taller than Sam." (Valid transitive reasoning)
+
+✗ ILLOGICAL:
+  Premises: "Cats are animals."
+  Conclusion: "Cats can talk." (Non-sequitur, doesn't follow)
+
+✓ LOGICAL:
+  Premises: "TFPS conducts photography workshops. Photography workshops teach camera techniques."
+  Conclusion: "TFPS teaches camera techniques." (Valid syllogism)
+
+✗ ILLOGICAL:
+  Premises: "Event X happened in 2023."
+  Conclusion: "Event X happens every year." (Unsupported generalization)
+
+Step 4: Check Redundancy
+- Is this Thought adding new information?
+- Or is it repeating what was already established in previous steps?
+
+CRITICAL: Return ONLY a valid JSON object, nothing else. No explanations, no markdown formatting.
+ +Output format: +{{ + "is_logical": true/false, + "confidence": 0.0-1.0, + "reasoning": "Explanation of the logical flow (or lack thereof)", + "is_redundant": true/false, + "action": "keep" | "merge" | "discard" +}} + +Action Guidelines: +- "keep": Logical and adds new information +- "merge": Logical but redundant with previous steps +- "discard": Illogical or completely unsupported + +Confidence Guidelines: +- 0.9-1.0: Clearly follows from premises, no logical issues +- 0.7-0.8: Mostly logical, minor inference gaps +- 0.5-0.6: Some logical connection, requires assumptions +- 0.3-0.4: Weak logical connection, significant assumptions +- 0.0-0.2: No logical connection or fallacious reasoning""" + + response = await self.call_llm(prompt) + + # Log the raw response for debugging + if not response or not response.strip(): + logger.error(f"LogicExpert received empty response from LLM") + return { + "score": 0.7, + "passed": True, + "remarks": "LLM returned empty response, defaulting to PASS", + "action": "keep", + "expert": "LogicExpert" + } + + try: + import json + # Strip markdown code fences before parsing + cleaned_response = strip_markdown_json(response) + result = json.loads(cleaned_response.strip()) + + is_logical = result.get("is_logical", False) + confidence = float(result.get("confidence", 0.5)) + reasoning = result.get("reasoning", "") + is_redundant = result.get("is_redundant", False) + action = result.get("action", "keep") + + # Strict: pass only if is_logical=true AND confidence >= 0.6 + passed = is_logical and confidence >= 0.6 + + # Override action if not logical + if not is_logical: + action = "discard" + + return { + "score": confidence, + "passed": passed, + "remarks": reasoning, + "action": action, + "expert": "LogicExpert" + } + + except Exception as e: + logger.error(f"Failed to parse LogicExpert response: {e}") + logger.error(f"Raw response was: {response}") + logger.error(f"Cleaned response was: {cleaned_response}") + return { + "score": 0.5, + 
"passed": True, + "remarks": f"Parse error: {e}", + "action": "keep", + "expert": "LogicExpert" + } + + +class MoEGauntlet: + """ + Orchestrates the three experts with weighted voting + """ + + def __init__(self, groq_client: GroqClient): + """ + Initialize the MoE Gauntlet with Groq client + + Args: + groq_client: Groq client for expert calls + """ + self.hallucination_hunter = HallucinationHunter(groq_client=groq_client) + self.source_matcher = SourceMatcher(groq_client=groq_client) + self.logic_expert = LogicExpert(groq_client=groq_client) + + async def verify_thought( + self, + thought: str, + context: str, + graph_history: List[Dict] + ) -> Dict: + """ + Run all three experts in parallel and compute weighted vote using STRICT consensus + + Weighted Voting Formula (STRICT): + final_score = (source_matcher_score * 0.5) + (hallucination_score * 0.3) + (logic_score * 0.2) + + CONSENSUS RULE (All experts must agree): + - Source Matcher: MUST pass (verdict=YES, confidence >= 0.6) + - Hallucination Hunter: MUST pass (no hallucinations detected) + - Logic Expert: MUST pass (is_logical=true, confidence >= 0.6) + + Pass threshold: final_score >= 0.6 AND all three experts pass + + Args: + thought: The thought to verify + context: Retrieved context chunks + graph_history: Last 3 nodes from the graph + + Returns: + { + "passed": bool, + "final_score": float, + "action": str, + "expert_results": dict, + "remarks": str + } + """ + logger.info(f"Running MoE Gauntlet (STRICT MODE) on thought: {thought[:100]}...") + + # Run all experts in parallel + results = await asyncio.gather( + self.hallucination_hunter.verify(thought, context, graph_history), + self.source_matcher.verify(thought, context, graph_history), + self.logic_expert.verify(thought, context, graph_history), + return_exceptions=True + ) + + hallucination_result, source_result, logic_result = results + + # Handle any exceptions - be strict with failures + if isinstance(hallucination_result, Exception): + 
hallucination_result = {"score": 0.0, "passed": False, "remarks": f"Expert failed: {str(hallucination_result)}"}
+        if isinstance(source_result, Exception):
+            source_result = {"score": 0.0, "passed": False, "remarks": f"Expert failed: {str(source_result)}"}
+        if isinstance(logic_result, Exception):
+            logic_result = {"score": 0.0, "passed": False, "action": "discard", "remarks": f"Expert failed: {str(logic_result)}"}
+
+        # Weighted voting (strict)
+        # Source Matcher (50%), Hallucination Hunter (30%), Logic Expert (20%)
+        final_score = (
+            source_result["score"] * 0.5 +
+            hallucination_result["score"] * 0.3 +
+            logic_result["score"] * 0.2
+        )
+
+        # Check if each expert passes
+        source_pass = source_result["passed"]
+        hallucination_pass = hallucination_result["passed"]
+        logic_pass = logic_result["passed"]
+
+        # STRICT CONSENSUS RULE: ALL three experts must pass
+        experts_passed = sum([source_pass, hallucination_pass, logic_pass])
+        all_experts_agree = experts_passed == 3
+
+        # Build failure reasons
+        failure_reasons = []
+        if not source_pass:
+            failure_reasons.append("Source not found")
+        if not hallucination_pass:
+            failure_reasons.append("Hallucination detected")
+        if not logic_pass:
+            failure_reasons.append("Illogical reasoning")
+
+        # Determine overall verdict
+        if all_experts_agree and final_score >= 0.6:
+            # Perfect consensus with high score
+            passed = True
+            action = logic_result.get("action", "keep")
+            remarks = f"✓ VERIFIED: Expert consensus achieved (Score: {final_score:.2f}/1.0, All 3/3 experts passed)"
+        elif final_score < 0.6:
+            # Score too low
+            passed = False
+            action = "discard"
+            reasons = ", ".join(failure_reasons) if failure_reasons else "Low confidence"
+            remarks = f"✗ REJECTED: Score too low ({final_score:.2f} < 0.6). Issues: {reasons}"
+        elif not all_experts_agree:
+            # Not all experts agree
+            passed = False
+            action = "discard"
+            reasons = ", ".join(failure_reasons)
+            remarks = f"✗ REJECTED: Expert consensus failed ({experts_passed}/3 passed). Issues: {reasons}"
+        else:
+            # Edge case
+            passed = False
+            action = "discard"
+            remarks = "✗ REJECTED: Verification criteria not met"
+
+        logger.info(f"MoE Verdict (STRICT): {remarks}")
+
+        return {
+            "passed": passed,
+            "final_score": final_score,
+            "action": action,
+            "expert_results": {
+                "hallucination_hunter": hallucination_result,
+                "source_matcher": source_result,
+                "logic_expert": logic_result
+            },
+            "experts_passed": experts_passed,
+            "failure_reasons": failure_reasons,
+            "remarks": remarks
+        }
diff --git a/submissions/team_2/chatbot/backend/src/services/chat_agent/router.py b/submissions/team_2/chatbot/backend/src/services/chat_agent/router.py
new file mode 100644
index 0000000..c1f50ab
--- /dev/null
+++ b/submissions/team_2/chatbot/backend/src/services/chat_agent/router.py
@@ -0,0 +1,145 @@
+"""
+Simplified Graph of Thought (GoT) Service Router
+FastAPI endpoints for the simplified GoT reasoning service using Llama-4-Scout
+"""
+
+from fastapi import APIRouter, HTTPException, Depends
+from pydantic import BaseModel, Field
+from typing import Optional, Dict, List
+import logging
+import os
+
+from src.services.chat_agent.engine import SimplifiedGoTEngine
+
+logger = logging.getLogger(__name__)
+
+# Create router
+router = APIRouter(
+    prefix="/got",
+    tags=["graph-of-thought"],
+    responses={404: {"description": "Not found"}},
+)
+
+# Global engine instance
+_got_engine: Optional[SimplifiedGoTEngine] = None
+
+
+def get_got_engine() -> SimplifiedGoTEngine:
+    """Dependency to get the GoT engine instance"""
+    if _got_engine is None:
+        raise HTTPException(
+            status_code=503,
+            detail="GoT engine not initialized"
+        )
+    return _got_engine
+
+
+def set_got_engine(engine: SimplifiedGoTEngine):
+    """Set the global GoT engine 
instance"""
+    global _got_engine
+    _got_engine = engine
+
+
+# Pydantic models
+class GoTQueryRequest(BaseModel):
+    """Request model for GoT query"""
+    query: str = Field(..., description="The question to answer using MetaKGP wiki")
+
+
+class GoTQueryResponse(BaseModel):
+    """Response model for GoT query"""
+    query: str
+    answer: str
+    confidence: float = Field(..., ge=0.0, le=1.0)
+    chunks_retrieved: int
+    verification_passed: bool
+    verification_score: Optional[float] = None
+    reasoning: Optional[str] = None
+    sources: Optional[List[str]] = None
+    error: Optional[str] = None
+
+
+class GraphStatusResponse(BaseModel):
+    """Response model for graph status"""
+    status: str
+    engine_initialized: bool
+    model: str
+    top_k: int
+
+
+# API Endpoints
+
+@router.post("/query", response_model=GoTQueryResponse)
+async def query_got(
+    request: GoTQueryRequest,
+    engine: SimplifiedGoTEngine = Depends(get_got_engine)
+):
+    """
+    Process a query using simplified Graph of Thought reasoning with Llama-4-Scout
+
+    Pipeline:
+    1. Check if query is relevant to IIT Kharagpur/MetaKGP
+    2. Query RAG for top 30 relevant chunks
+    3. Analyze chunks using Graph of Thought reasoning
+    4. Run single MoE verification round (3 experts)
+    5. Generate final answer
+
+    Example request:
+    ```json
+    {
+        "query": "What is the hostel allocation process at IIT Kharagpur?"
+ } + ``` + + The response includes: + - Final answer + - Confidence score + - Verification status + - Source pages + """ + try: + logger.info(f"Received GoT query: {request.query}") + + # Process query through simplified GoT engine + result = await engine.process_query(query=request.query) + + return GoTQueryResponse(**result) + + except Exception as e: + logger.error(f"GoT query failed: {e}", exc_info=True) + raise HTTPException( + status_code=500, + detail=f"Query processing failed: {str(e)}" + ) + + +@router.get("/graph-status", response_model=GraphStatusResponse) +async def graph_status(engine: SimplifiedGoTEngine = Depends(get_got_engine)): + """ + Get status of the GoT engine + + Returns: + - Engine initialization status + - Model being used + - Configuration details + """ + try: + return GraphStatusResponse( + status="ok", + engine_initialized=True, + model="meta-llama/llama-4-scout-17b-16e-instruct", + top_k=engine.top_k + ) + + except Exception as e: + logger.error(f"Status check failed: {e}") + raise HTTPException( + status_code=500, + detail=f"Status check failed: {str(e)}" + ) + + +@router.get("/health") +async def health(): + """Simple health check""" + return {"status": "ok", "service": "GoT"} diff --git a/submissions/team_2/chatbot/backend/src/services/indexing/__init__.py b/submissions/team_2/chatbot/backend/src/services/indexing/__init__.py new file mode 100644 index 0000000..d7c00f5 --- /dev/null +++ b/submissions/team_2/chatbot/backend/src/services/indexing/__init__.py @@ -0,0 +1 @@ +"""Indexing service for processing and embedding wiki content""" diff --git a/submissions/team_2/chatbot/backend/src/services/indexing/indexer.py b/submissions/team_2/chatbot/backend/src/services/indexing/indexer.py new file mode 100644 index 0000000..68a9296 --- /dev/null +++ b/submissions/team_2/chatbot/backend/src/services/indexing/indexer.py @@ -0,0 +1,469 @@ +#!/usr/bin/env python3 +""" +MetaKGP Wiki Indexer Service +PostgreSQL β†’ Chunk β†’ Embed β†’ ChromaDB 
+""" + +import argparse +import logging +import os +import pickle +import sys +import time +from pathlib import Path +from typing import List, Dict, Optional + +# Add parent directory to path for imports +sys.path.insert(0, str(Path(__file__).parent.parent.parent)) + +from dotenv import load_dotenv +from sqlalchemy import create_engine, text + +from src.utils.embedding_client import ModalEmbeddingClient +from src.utils.chunk_processor import WikiChunkProcessor +from src.utils.chroma_client import MetaKGPChromaClient + +# Setup logging +logging.basicConfig( + level=logging.INFO, + format='%(asctime)s - %(name)s - %(levelname)s - %(message)s', + datefmt='%Y-%m-%d %H:%M:%S' +) +logger = logging.getLogger(__name__) + + +class MetaKGPIndexer: + """Indexer service for MetaKGP wiki pages""" + + def __init__( + self, + db_url: str, + modal_url: str, + chroma_dir: str = "./chroma_data", + cache_dir: str = "./cache", + batch_size: int = 100, + embedding_batch_size: int = 30, + reset_offset: bool = False + ): + """ + Initialize MetaKGP indexer + + Args: + db_url: PostgreSQL connection URL + modal_url: Modal embedding service URL + chroma_dir: ChromaDB persistence directory + cache_dir: Cache directory for embeddings and offset + batch_size: Number of pages to fetch per batch + embedding_batch_size: Number of chunks to embed at once + reset_offset: Reset offset and reindex from beginning + """ + self.db_url = db_url + self.batch_size = batch_size + self.embedding_batch_size = embedding_batch_size + + # Initialize components + logger.info(" Initializing MetaKGP Indexer...") + + self.embedding_client = ModalEmbeddingClient(modal_url) + self.chunk_processor = WikiChunkProcessor(chunk_size=512, chunk_overlap=50) + self.chroma_client = MetaKGPChromaClient( + persist_dir=chroma_dir, + collection_name="metakgp_wiki" + ) + + # Cache setup + self.cache_dir = Path(cache_dir) + self.cache_dir.mkdir(parents=True, exist_ok=True) + + self.embedding_cache_file = self.cache_dir / 
"embeddings.pkl" + self.offset_file = self.cache_dir / "last_processed_id.txt" + + # Load embedding cache + self.embedding_cache: Dict[str, List[float]] = self._load_embedding_cache() + self._cache_hits = 0 + self._cache_misses = 0 + + # Reset offset if requested + if reset_offset and self.offset_file.exists(): + logger.warning(" Resetting offset - will reindex from beginning") + self.offset_file.unlink() + + # Statistics + self._pages_processed = 0 + self._chunks_indexed = 0 + self._start_time = time.time() + + logger.info(" Indexer initialized successfully") + + def _load_embedding_cache(self) -> Dict[str, List[float]]: + """Load embedding cache from disk""" + if self.embedding_cache_file.exists(): + try: + with open(self.embedding_cache_file, 'rb') as f: + cache = pickle.load(f) + logger.info(f" Loaded {len(cache)} cached embeddings") + return cache + except Exception as e: + logger.warning(f"️ Could not load embedding cache: {e}") + return {} + + def _save_embedding_cache(self): + """Save embedding cache to disk""" + try: + with open(self.embedding_cache_file, 'wb') as f: + pickle.dump(self.embedding_cache, f) + logger.info( + f" Saved {len(self.embedding_cache)} embeddings to cache " + f"(hits: {self._cache_hits}, misses: {self._cache_misses})" + ) + except Exception as e: + logger.error(f" Failed to save embedding cache: {e}") + + def _get_last_processed_id(self) -> int: + """Get last processed page ID from offset file""" + if self.offset_file.exists(): + try: + id_str = self.offset_file.read_text().strip() + return int(id_str) if id_str else 0 + except Exception as e: + logger.warning(f"️ Could not read offset file: {e}") + return 0 + + def _save_last_processed_id(self, page_id: int): + """Save last processed page ID to offset file""" + try: + self.offset_file.write_text(str(page_id)) + except Exception as e: + logger.error(f" Failed to save offset: {e}") + + def _get_embedding(self, text: str) -> Optional[List[float]]: + """ + Get embedding with caching + + 
Uses first 200 characters as cache key for deterministic lookups + + Args: + text: Text to embed + + Returns: + Embedding vector or None if failed + """ + # Use first 200 chars as cache key + cache_key = text[:200] + + # Check cache + if cache_key in self.embedding_cache: + self._cache_hits += 1 + return self.embedding_cache[cache_key] + + # Cache miss - call Modal API + self._cache_misses += 1 + embedding = self.embedding_client(text) + + if embedding: + self.embedding_cache[cache_key] = embedding + return embedding + + logger.error("Failed to generate embedding") + return None + + def _fetch_pages_batch( + self, + engine, + last_id: int + ) -> List[Dict]: + """ + Fetch next batch of pages from PostgreSQL + + Args: + engine: SQLAlchemy engine + last_id: Last processed page ID + + Returns: + List of page dictionaries + """ + try: + with engine.connect() as conn: + query = text(""" + SELECT + id, + name, + title, + cleaned_text, + categories, + links + FROM metakgp_pages + WHERE id > :last_id + AND exists = true + AND redirect = false + AND cleaned_text IS NOT NULL + AND cleaned_text != '' + ORDER BY id ASC + LIMIT :batch_size + """) + + result = conn.execute( + query, + {"last_id": last_id, "batch_size": self.batch_size} + ) + + pages = [dict(row._mapping) for row in result] + return pages + + except Exception as e: + logger.error(f" Failed to fetch pages: {e}") + return [] + + def index_batch(self, pages: List[Dict]): + """ + Process and index a batch of pages + + Pipeline: + 1. Chunk pages with entity extraction + 2. Generate embeddings (with caching) + 3. 
Add to ChromaDB + + Args: + pages: List of page dictionaries from database + """ + if not pages: + return + + logger.info(f" Processing {len(pages)} pages...") + + # Step 1: Process pages into chunks + all_chunk_objects = [] + + for page in pages: + try: + chunks = self.chunk_processor.process_page( + page_name=page["name"], + title=page["title"] or page["name"], + cleaned_text=page["cleaned_text"] or "", + categories=page["categories"] or [], + links=page["links"] or [] + ) + all_chunk_objects.extend(chunks) + except Exception as e: + logger.error(f" Failed to process page {page['name']}: {e}") + + if not all_chunk_objects: + logger.warning("️ No chunks generated from batch") + return + + logger.info( + f"️ Generated {len(all_chunk_objects)} chunks " + f"from {len(pages)} pages" + ) + + # Step 2: Generate embeddings + chunk_ids = [] + texts = [] + embeddings = [] + successful_chunks = [] + + for chunk_obj in all_chunk_objects: + try: + # Get embedding (with cache) + embedding = self._get_embedding(chunk_obj["text"]) + + if embedding: + chunk_ids.append(chunk_obj["chunk_id"]) + texts.append(chunk_obj["text"]) + embeddings.append(embedding) + successful_chunks.append(chunk_obj) + else: + logger.warning(f"️ Skipping chunk {chunk_obj['chunk_id']} - no embedding") + + except Exception as e: + logger.error(f" Failed to embed chunk: {e}") + + logger.info(f" Generated {len(embeddings)} embeddings") + + # Step 3: Add to ChromaDB + if chunk_ids: + try: + self.chroma_client.add_chunks( + chunk_ids=chunk_ids, + texts=texts, + embeddings=embeddings, + chunk_objects=successful_chunks + ) + self._chunks_indexed += len(chunk_ids) + except Exception as e: + logger.error(f" Failed to add chunks to ChromaDB: {e}") + + # Update statistics + self._pages_processed += len(pages) + + # Save cache periodically + if self._pages_processed % 10 == 0: + self._save_embedding_cache() + + def _print_stats(self): + """Print indexing statistics""" + elapsed = time.time() - self._start_time + + 
logger.info("=" * 60) + logger.info(" Indexing Statistics:") + logger.info(f" Pages processed: {self._pages_processed}") + logger.info(f" Chunks indexed: {self._chunks_indexed}") + logger.info(f" Cache hits: {self._cache_hits}") + logger.info(f" Cache misses: {self._cache_misses}") + if self._cache_hits + self._cache_misses > 0: + hit_rate = 100 * self._cache_hits / (self._cache_hits + self._cache_misses) + logger.info(f" Cache hit rate: {hit_rate:.1f}%") + logger.info(f" Elapsed time: {elapsed:.1f}s") + logger.info(f" Total docs in ChromaDB: {self.chroma_client.get_count()}") + logger.info("=" * 60) + + def run(self): + """Main indexing loop""" + logger.info(" Starting indexing service") + logger.info(f" Database: {self.db_url.split('@')[-1]}") # Hide credentials + + engine = create_engine(self.db_url) + last_id = self._get_last_processed_id() + + logger.info(f" Starting from page ID: {last_id}") + + try: + iteration = 0 + + while True: + iteration += 1 + + # Fetch next batch + pages = self._fetch_pages_batch(engine, last_id) + + if not pages: + logger.info(" No more pages to index") + self._print_stats() + logger.info(" Sleeping for 60s before checking for new pages...") + time.sleep(60) + continue + + logger.info( + f"\n{'='*60}\n" + f"Iteration {iteration}: Fetched {len(pages)} pages " + f"(IDs {pages[0]['id']} - {pages[-1]['id']})\n" + f"{'='*60}" + ) + + # Process batch + self.index_batch(pages) + + # Update offset + last_id = pages[-1]["id"] + self._save_last_processed_id(last_id) + + # Print periodic stats + if iteration % 5 == 0: + self._print_stats() + + # Brief pause between batches + time.sleep(2) + + except KeyboardInterrupt: + logger.info("\n⏹️ Stopping indexer (Ctrl+C)") + + except Exception as e: + logger.error(f" Fatal error: {e}", exc_info=True) + + finally: + # Save cache and print final stats + self._save_embedding_cache() + self._print_stats() + + +def main(): + """Main entry point""" + # Load environment variables from team_2 root directory + 
env_path = Path(__file__).resolve().parents[5] / '.env' # Go up to team_2/ + if env_path.exists(): + load_dotenv(env_path) + logger.info(f"Loaded .env from {env_path}") + else: + # Fallback to default behavior + load_dotenv() + logger.warning(f".env not found at {env_path}, using default load_dotenv()") + + # Construct DATABASE_URL from environment variables + db_host = os.getenv("DB_HOST") + db_port = os.getenv("DB_PORT", "5432") + db_name = os.getenv("DB_NAME") + db_user = os.getenv("DB_USER") + db_password = os.getenv("DB_PASSWORD") + db_sslmode = os.getenv("DB_SSLMODE", "require") + + # Build connection URL + if db_host and db_name and db_user and db_password: + db_url = f"postgresql://{db_user}:{db_password}@{db_host}:{db_port}/{db_name}?sslmode={db_sslmode}" + else: + # Fallback to DATABASE_URL if individual components not provided + db_url = os.getenv("DATABASE_URL") + if not db_url: + raise ValueError("Database configuration missing. Set DB_HOST, DB_NAME, DB_USER, DB_PASSWORD or DATABASE_URL") + + parser = argparse.ArgumentParser( + description="MetaKGP Wiki Indexer Service", + formatter_class=argparse.ArgumentDefaultsHelpFormatter + ) + + parser.add_argument( + "--db-url", + default=db_url, + help="PostgreSQL connection URL (e.g., postgresql://user:pass@host:5432/dbname)" + ) + parser.add_argument( + "--modal-url", + default=os.getenv("MODAL_URL"), + help="Modal embedding service URL (e.g., https://...modal.run)" + ) + parser.add_argument( + "--chroma-dir", + default=os.getenv("CHROMA_DIR", "./chroma_data"), + help="ChromaDB persistence directory" + ) + parser.add_argument( + "--cache-dir", + default=os.getenv("CACHE_DIR", "./cache"), + help="Cache directory for embeddings and offset tracking" + ) + parser.add_argument( + "--batch-size", + type=int, + default=100, + help="Number of pages to fetch per batch" + ) + parser.add_argument( + "--embedding-batch-size", + type=int, + default=30, + help="Number of chunks to embed in parallel" + ) + 
parser.add_argument( + "--reset-offset", + action="store_true", + help="Reset offset and reindex from beginning" + ) + + args = parser.parse_args() + + # Create indexer + indexer = MetaKGPIndexer( + db_url=args.db_url, + modal_url=args.modal_url, + chroma_dir=args.chroma_dir, + cache_dir=args.cache_dir, + batch_size=args.batch_size, + embedding_batch_size=args.embedding_batch_size, + reset_offset=args.reset_offset + ) + + # Run + indexer.run() + + +if __name__ == "__main__": + main() diff --git a/submissions/team_2/chatbot/backend/src/services/query_service/__init__.py b/submissions/team_2/chatbot/backend/src/services/query_service/__init__.py new file mode 100644 index 0000000..6cd3c47 --- /dev/null +++ b/submissions/team_2/chatbot/backend/src/services/query_service/__init__.py @@ -0,0 +1 @@ +"""Query service API for semantic search""" diff --git a/submissions/team_2/chatbot/backend/src/services/query_service/router.py b/submissions/team_2/chatbot/backend/src/services/query_service/router.py new file mode 100644 index 0000000..869e55b --- /dev/null +++ b/submissions/team_2/chatbot/backend/src/services/query_service/router.py @@ -0,0 +1,178 @@ +""" +MetaKGP Query Service Router +FastAPI routes for semantic search +""" + +from fastapi import APIRouter, HTTPException, Depends +from pydantic import BaseModel, Field +from typing import List, Optional +import logging + +from src.services.query_service.service import QueryService + +logger = logging.getLogger(__name__) + +# Create router +router = APIRouter( + prefix="/query", + tags=["query"], + responses={404: {"description": "Not found"}}, +) + +# Global service instance (will be set by main app) +_query_service: Optional[QueryService] = None + + +def get_query_service() -> QueryService: + """Dependency to get the query service instance""" + if _query_service is None: + raise HTTPException( + status_code=503, + detail="Query service not initialized" + ) + return _query_service + + +def set_query_service(service: 
QueryService): + """Set the global query service instance""" + global _query_service + _query_service = service + + + # Pydantic models + class SearchFilters(BaseModel): + """Optional filters for search""" + source_page: Optional[str] = Field(None, description="Filter by specific page name") + category: Optional[str] = Field(None, description="Filter by category") + min_entity_count: Optional[int] = Field(None, description="Minimum number of entities") + + + class SearchRequest(BaseModel): + """Search request model""" + query: str = Field(..., description="Search query text") + top_k: int = Field(10, ge=1, le=100, description="Number of results to return") + filters: Optional[SearchFilters] = Field(None, description="Optional metadata filters") + + + class SearchResultMetadata(BaseModel): + """Metadata for a search result""" + source_page: str + title: str + chunk_index: int + total_chunks: int + categories: List[str] + entities: List[str] + entity_count: int + relationship_count: int + + + class SearchResult(BaseModel): + """Single search result""" + chunk_id: str + text: str + score: float = Field(..., ge=0.0, le=1.0, description="Relevance score (0-1)") + metadata: SearchResultMetadata + + + class SearchResponse(BaseModel): + """Search response with results and metadata""" + results: List[SearchResult] + query_time_ms: float + total_results: int + + + class HealthResponse(BaseModel): + """Health check response""" + status: str + num_documents: int + embedding_service: str + + + # API Endpoints + + @router.post("/search", response_model=SearchResponse) + async def search( + request: SearchRequest, + service: QueryService = Depends(get_query_service) + ): + """ + Semantic search over wiki chunks + + Example request: + ```json + { + "query": "Who is the vice president of TSG?", + "top_k": 5 + } + ``` + """ + try: + # Prepare filters + filters_dict = None + category_filter = None + if request.filters: + filters_dict = request.filters.model_dump(exclude_none=True) + # Extract 
category for post-processing (ChromaDB doesn't support substring matching) + category_filter = filters_dict.pop("category", None) + + # Perform search + result = service.search( + query=request.query, + top_k=request.top_k, + filters=filters_dict, + category_filter=category_filter + ) + + # Convert dict results to Pydantic models + search_results = [ + SearchResult( + chunk_id=r["chunk_id"], + text=r["text"], + score=r["score"], + metadata=SearchResultMetadata(**r["metadata"]) + ) + for r in result["results"] + ] + + return SearchResponse( + results=search_results, + query_time_ms=result["query_time_ms"], + total_results=result["total_results"] + ) + + except ValueError as e: + raise HTTPException( + status_code=500, + detail=str(e) + ) + + except Exception as e: + logger.error(f"Search failed: {e}", exc_info=True) + raise HTTPException( + status_code=500, + detail=f"Search failed: {str(e)}" + ) + + +@router.get("/health", response_model=HealthResponse) +async def health(service: QueryService = Depends(get_query_service)): + """ + Health check endpoint + + Returns service status and document count + """ + try: + num_docs = service.get_document_count() + + return HealthResponse( + status="ok", + num_documents=num_docs, + embedding_service="modal" + ) + + except Exception as e: + logger.error(f"Health check failed: {e}") + raise HTTPException( + status_code=503, + detail=f"Service unhealthy: {str(e)}" + ) diff --git a/submissions/team_2/chatbot/backend/src/services/query_service/service.py b/submissions/team_2/chatbot/backend/src/services/query_service/service.py new file mode 100644 index 0000000..68a0b9f --- /dev/null +++ b/submissions/team_2/chatbot/backend/src/services/query_service/service.py @@ -0,0 +1,136 @@ +""" +MetaKGP Query Service - Business Logic +Handles semantic search operations over wiki chunks +""" + +import logging +import time +from typing import List, Optional, Dict + +from src.utils.embedding_client import ModalEmbeddingClient +from 
src.utils.chroma_client import MetaKGPChromaClient + +logger = logging.getLogger(__name__) + + +class QueryService: + """Service class for semantic search operations""" + + def __init__(self, modal_url: str, chroma_dir: str, collection_name: str = "metakgp_wiki"): + """ + Initialize the query service + + Args: + modal_url: URL of the Modal embedding service + chroma_dir: Directory for ChromaDB persistence + collection_name: Name of the ChromaDB collection + """ + self.embedding_client = ModalEmbeddingClient(modal_url) + self.chroma_client = MetaKGPChromaClient( + persist_dir=chroma_dir, + collection_name=collection_name + ) + logger.info(f"QueryService initialized with {self.chroma_client.get_count()} documents") + + def get_document_count(self) -> int: + """Get the total number of documents in the collection""" + return self.chroma_client.get_count() + + def search( + self, + query: str, + top_k: int = 10, + filters: Optional[Dict] = None, + category_filter: Optional[str] = None + ) -> Dict: + """ + Perform semantic search over wiki chunks + + Args: + query: Search query text + top_k: Number of results to return + filters: Optional metadata filters for ChromaDB + category_filter: Optional category filter (applied post-search) + + Returns: + Dict with 'results' (list of results), 'query_time_ms', and 'total_results' + """ + start_time = time.time() + + logger.info(f"Query: {query[:100]}...") + + # Generate query embedding + query_embedding = self.embedding_client(query) + + if not query_embedding: + raise ValueError("Failed to generate query embedding") + + logger.info(f"Generated embedding dimension: {len(query_embedding)}") + + # Search ChromaDB (get more results if we need to filter by category) + search_top_k = top_k * 2 if category_filter else top_k + + results = self.chroma_client.search( + query_embedding=query_embedding, + top_k=search_top_k, + filters=filters + ) + + logger.info(f"ChromaDB returned {len(results['ids'])} results") + + # Format results + 
search_results = [] + + for i, chunk_id in enumerate(results["ids"]): + # Convert distance to similarity score [0, 1] + distance = results["distances"][i] + score = 1.0 / (1.0 + distance) # Inverse distance normalization + + # Parse metadata + raw_metadata = results["metadatas"][i] + + # Deserialize arrays + categories = raw_metadata.get("categories", "").split(",") + categories = [c.strip() for c in categories if c.strip()] + + entities = raw_metadata.get("entities", "").split(",") + entities = [e.strip() for e in entities if e.strip()] + + # Build result dictionary + result = { + "chunk_id": chunk_id, + "text": results["documents"][i], + "score": score, + "metadata": { + "source_page": raw_metadata.get("source_page", ""), + "title": raw_metadata.get("title", ""), + "chunk_index": raw_metadata.get("chunk_index", 0), + "total_chunks": raw_metadata.get("total_chunks", 0), + "categories": categories, + "entities": entities, + "entity_count": raw_metadata.get("entity_count", 0), + "relationship_count": raw_metadata.get("relationship_count", 0) + } + } + + search_results.append(result) + + # Post-process: filter by category if requested + if category_filter: + search_results = [ + result for result in search_results + if category_filter.lower() in [cat.lower() for cat in result["metadata"]["categories"]] + ] + # Trim to requested top_k + search_results = search_results[:top_k] + + # Calculate query time + query_time_ms = (time.time() - start_time) * 1000 + + logger.info(f"Found {len(search_results)} results in {query_time_ms:.1f}ms") + + return { + "results": search_results, + "query_time_ms": query_time_ms, + "total_results": len(search_results) + } diff --git a/submissions/team_2/chatbot/backend/src/utils/__init__.py b/submissions/team_2/chatbot/backend/src/utils/__init__.py new file mode 100644 index 0000000..89671be --- /dev/null +++ b/submissions/team_2/chatbot/backend/src/utils/__init__.py @@ -0,0 +1 @@ +"""Utility modules for ChromaDB, embeddings, and text 
processing""" diff --git a/submissions/team_2/chatbot/backend/src/utils/chroma_client.py b/submissions/team_2/chatbot/backend/src/utils/chroma_client.py new file mode 100644 index 0000000..6a3503e --- /dev/null +++ b/submissions/team_2/chatbot/backend/src/utils/chroma_client.py @@ -0,0 +1,213 @@ +""" +ChromaDB Client for MetaKGP Wiki +Vector storage with rich metadata and graph relationships +""" + +import chromadb +from chromadb.config import Settings +from typing import List, Dict, Optional +import logging +import json + +logger = logging.getLogger(__name__) + + +class MetaKGPChromaClient: + """ChromaDB client for MetaKGP wiki chunks with entity/relationship support""" + + def __init__( + self, + persist_dir: str = "./chroma_data", + collection_name: str = "metakgp_wiki" + ): + """ + Initialize ChromaDB client + + Args: + persist_dir: Directory for persistent storage + collection_name: Name of the collection + """ + self.persist_dir = persist_dir + self.collection_name = collection_name + + # Initialize ChromaDB with persistent storage + self.client = chromadb.PersistentClient( + path=persist_dir, + settings=Settings( + anonymized_telemetry=False, + allow_reset=True + ) + ) + + # Get or create collection with cosine similarity + self.collection = self.client.get_or_create_collection( + name=collection_name, + metadata={ + "description": "MetaKGP wiki chunks with entities and relationships", + "hnsw:space": "cosine" + } + ) + + logger.info(f" ChromaDB collection '{collection_name}' initialized at {persist_dir}") + logger.info(f" Current document count: {self.collection.count()}") + + def _flatten_metadata(self, chunk_obj: Dict) -> Dict: + """ + Flatten nested chunk object for ChromaDB storage + + ChromaDB doesn't support nested dictionaries or arrays in metadata, + so we serialize complex types to strings. 
+ + Args: + chunk_obj: Chunk object with nested metadata + + Returns: + Flattened metadata dictionary + """ + meta = chunk_obj["metadata"] + + return { + # Basic metadata + "source_page": meta["source_page"], + "title": meta["title"], + "chunk_index": meta["chunk_index"], + "total_chunks": meta["total_chunks"], + + # Serialize arrays as comma-separated strings + "categories": ",".join(meta["categories"]) if meta["categories"] else "", + "entities": ",".join(chunk_obj.get("entities", [])), + + # Store counts for filtering + "entity_count": len(chunk_obj.get("entities", [])), + "relationship_count": len(chunk_obj.get("relationships", [])), + + # Serialize relationships as JSON string + "relationships_json": json.dumps(chunk_obj.get("relationships", [])) + } + + def add_chunks( + self, + chunk_ids: List[str], + texts: List[str], + embeddings: List[List[float]], + chunk_objects: List[Dict] + ): + """ + Add chunks to ChromaDB with embeddings and metadata + + Args: + chunk_ids: List of unique chunk IDs + texts: List of chunk texts + embeddings: List of embedding vectors + chunk_objects: List of full chunk objects (with metadata, entities, relationships) + """ + if not chunk_ids: + logger.warning("️ No chunks to add") + return + + try: + # Flatten all metadata + metadatas = [self._flatten_metadata(obj) for obj in chunk_objects] + + # Add to ChromaDB + self.collection.add( + ids=chunk_ids, + embeddings=embeddings, + documents=texts, + metadatas=metadatas + ) + + logger.info(f" Added {len(chunk_ids)} chunks to ChromaDB") + + except Exception as e: + logger.error(f" Failed to add chunks to ChromaDB: {e}") + raise + + def search( + self, + query_embedding: List[float], + top_k: int = 10, + filters: Optional[Dict] = None + ) -> Dict: + """ + Search ChromaDB with optional metadata filters + + Args: + query_embedding: Query embedding vector + top_k: Number of results to return + filters: Optional filters like: + { + "source_page": "IIT_Kharagpur", + "category": "Academics", + 
"min_entity_count": 3 + } + + Returns: + Search results with ids, documents, metadatas, distances + """ + try: + # Build ChromaDB where clause + where = None + if filters: + where = {} + + # Exact match filters + if "source_page" in filters: + where["source_page"] = filters["source_page"] + + # Note: Category filtering is disabled because ChromaDB doesn't support + # substring/contains operators on comma-separated strings. + # To filter by category, you would need to do post-processing on results. + + # Numeric filters + if "min_entity_count" in filters: + where["entity_count"] = {"$gte": filters["min_entity_count"]} + + # Query ChromaDB + results = self.collection.query( + query_embeddings=[query_embedding], + n_results=top_k, + where=where if where else None + ) + + # Return flattened results + return { + "ids": results["ids"][0] if results["ids"] else [], + "documents": results["documents"][0] if results["documents"] else [], + "metadatas": results["metadatas"][0] if results["metadatas"] else [], + "distances": results["distances"][0] if results["distances"] else [] + } + + except Exception as e: + logger.error(f" ChromaDB search failed: {e}") + raise + + def get_count(self) -> int: + """Get total number of chunks in collection""" + return self.collection.count() + + def get_by_ids(self, chunk_ids: List[str]) -> Dict: + """ + Retrieve chunks by IDs + + Args: + chunk_ids: List of chunk IDs + + Returns: + Dictionary with ids, documents, metadatas + """ + try: + results = self.collection.get(ids=chunk_ids) + return results + except Exception as e: + logger.error(f" Failed to retrieve chunks: {e}") + return {"ids": [], "documents": [], "metadatas": []} + + def reset(self): + """Delete all data (for testing/reindexing)""" + logger.warning("️ Resetting ChromaDB collection") + self.client.delete_collection(self.collection_name) + self.collection = self.client.create_collection( + name=self.collection_name, + metadata={"hnsw:space": "cosine"} + ) diff --git 
a/submissions/team_2/chatbot/backend/src/utils/chunk_processor.py b/submissions/team_2/chatbot/backend/src/utils/chunk_processor.py new file mode 100644 index 0000000..ca0bb60 --- /dev/null +++ b/submissions/team_2/chatbot/backend/src/utils/chunk_processor.py @@ -0,0 +1,275 @@ +""" +Wiki Chunk Processor +Processes wiki pages into chunks with entity extraction and relationship mapping +""" + +import hashlib +from typing import List, Dict +import spacy +import logging + +logger = logging.getLogger(__name__) + + +class WikiChunkProcessor: + """Process wiki pages into chunks with entities and relationships""" + + def __init__(self, chunk_size: int = 512, chunk_overlap: int = 50): + """ + Initialize chunk processor + + Args: + chunk_size: Target chunk size in words + chunk_overlap: Overlap between chunks in words + """ + self.chunk_size = chunk_size + self.chunk_overlap = chunk_overlap + + # Load spaCy model for entity extraction + try: + self.nlp = spacy.load("en_core_web_sm") + logger.info("Loaded spaCy model: en_core_web_sm") + except OSError: + logger.warning("spaCy model not found, downloading...") + import subprocess, sys + # Use the current interpreter so the model installs into the active environment + subprocess.run([sys.executable, "-m", "spacy", "download", "en_core_web_sm"], check=True) + self.nlp = spacy.load("en_core_web_sm") + + def chunk_text(self, text: str) -> List[str]: + """ + Split text into overlapping chunks + + Args: + text: Input text + + Returns: + List of text chunks + """ + if not text or not text.strip(): + return [] + + words = text.split() + chunks = [] + + for i in range(0, len(words), self.chunk_size - self.chunk_overlap): + chunk_words = words[i:i + self.chunk_size] + chunk = ' '.join(chunk_words) + + if chunk.strip(): + chunks.append(chunk) + + # Stop if we've reached the end + if i + self.chunk_size >= len(words): + break + + return chunks + + def extract_entities(self, text: str, max_entities: int = 50) -> List[str]: + """ + Extract named entities using spaCy + + Args: + text: Input text + max_entities: Maximum number of 
entities to extract + + Returns: + List of unique entity strings + """ + if not text: + return [] + + try: + # Truncate very long texts to avoid memory issues + text_sample = text[:10000] + doc = self.nlp(text_sample) + + # Extract unique entities + entities = [] + seen = set() + + for ent in doc.ents: + entity_text = ent.text.strip() + if entity_text and entity_text.lower() not in seen: + entities.append(entity_text) + seen.add(entity_text.lower()) + + if len(entities) >= max_entities: + break + + return entities + + except Exception as e: + logger.error(f"Entity extraction failed: {e}") + return [] + + def extract_wiki_links( + self, + wiki_links: List[str], + chunk_text: str, + max_links: int = 20 + ) -> List[str]: + """ + Filter wiki links that appear in chunk text + + Args: + wiki_links: All wiki links from page + chunk_text: Current chunk text + max_links: Maximum links to return + + Returns: + List of relevant wiki links + """ + if not wiki_links: + return [] + + chunk_lower = chunk_text.lower() + relevant_links = [] + + for link in wiki_links: + # Normalize link (replace underscores with spaces) + normalized_link = link.replace('_', ' ').lower() + + # Check if link appears in chunk + if normalized_link in chunk_lower or link.lower() in chunk_lower: + relevant_links.append(link) + + if len(relevant_links) >= max_links: + break + + return relevant_links + + def build_relationships( + self, + entities: List[str], + wiki_links: List[str], + max_relationships: int = 30 + ) -> List[Dict[str, str]]: + """ + Build entity relationships from co-occurrence and wiki links + + Args: + entities: Extracted named entities + wiki_links: Relevant wiki links + max_relationships: Maximum relationships to create + + Returns: + List of relationship dictionaries + """ + relationships = [] + + # 1. 
Entity co-occurrence relationships (entities that appear together) + for i, ent1 in enumerate(entities[:10]): # Limit to top 10 entities + for ent2 in entities[i+1:i+6]: # Max 5 relationships per entity + relationships.append({ + "from": ent1, + "to": ent2, + "type": "co_occurs_with" + }) + + if len(relationships) >= max_relationships: + return relationships + + # 2. Entity to wiki-link relationships + for link in wiki_links[:10]: # Top 10 wiki links + for entity in entities[:5]: # Top 5 entities + relationships.append({ + "from": entity, + "to": link, + "type": "mentioned_in_page" + }) + + if len(relationships) >= max_relationships: + return relationships + + return relationships + + def process_page( + self, + page_name: str, + title: str, + cleaned_text: str, + categories: List[str], + links: List[str] + ) -> List[Dict]: + """ + Process a wiki page into chunks with metadata + + Args: + page_name: Unique page identifier + title: Display title + cleaned_text: Cleaned page content + categories: Page categories + links: Wiki links from page + + Returns: + List of chunk dictionaries in required format: + { + "chunk_id": "unique_id", + "text": "chunk content", + "metadata": { + "source_page": "page_name", + "title": "page_title", + "categories": ["cat1", "cat2"], + "chunk_index": 0, + "total_chunks": 5 + }, + "entities": ["entity1", "entity2"], + "relationships": [{"from": "e1", "to": "e2", "type": "related"}] + } + """ + if not cleaned_text or not cleaned_text.strip(): + logger.debug(f"Skipping empty page: {page_name}") + return [] + + # Split into chunks + chunks = self.chunk_text(cleaned_text) + total_chunks = len(chunks) + + if total_chunks == 0: + return [] + + processed_chunks = [] + + for idx, chunk_text in enumerate(chunks): + try: + # Generate unique chunk ID (deterministic hash) + chunk_id_source = f"{page_name}_{idx}_{chunk_text[:100]}" + chunk_id = hashlib.sha256(chunk_id_source.encode()).hexdigest()[:16] + + # Extract entities from this chunk + entities = 
self.extract_entities(chunk_text, max_entities=20) + + # Filter relevant wiki links for this chunk + relevant_links = self.extract_wiki_links(links, chunk_text, max_links=15) + + # Build relationships + relationships = self.build_relationships( + entities, + relevant_links, + max_relationships=25 + ) + + # Build chunk object + chunk_obj = { + "chunk_id": chunk_id, + "text": chunk_text, + "metadata": { + "source_page": page_name, + "title": title, + "categories": categories or [], + "chunk_index": idx, + "total_chunks": total_chunks + }, + "entities": entities, + "relationships": relationships + } + + processed_chunks.append(chunk_obj) + + except Exception as e: + logger.error(f"Failed to process chunk {idx} of {page_name}: {e}") + continue + + logger.debug(f"Processed {page_name}: {len(processed_chunks)} chunks") + return processed_chunks diff --git a/submissions/team_2/chatbot/backend/src/utils/embedding_client.py b/submissions/team_2/chatbot/backend/src/utils/embedding_client.py new file mode 100644 index 0000000..723042b --- /dev/null +++ b/submissions/team_2/chatbot/backend/src/utils/embedding_client.py @@ -0,0 +1,290 @@ +""" +Embedding Client for Modal Embedding Service +Features: +- Connection pooling with requests.Session +- Retry logic (1 retry, 0.2s backoff) +- Timeouts (5s connect, 30s read) +- Keep-alive headers +- Async support with httpx +""" + +import os +import logging +from typing import List, Optional, Dict +import requests +from requests.adapters import HTTPAdapter +from urllib3.util.retry import Retry +import time + +logger = logging.getLogger(__name__) + + +class ModalEmbeddingClient: + """Client for Modal embedding service with connection pooling and retry logic""" + + def __init__(self, modal_url: Optional[str] = None): + """ + Initialize embedding client + + Args: + modal_url: Modal embedding service URL (e.g., https://...modal.run) + If not provided, reads from MODAL_URL env variable + """ + self.modal_url = modal_url or 
os.getenv("MODAL_URL") + + if not self.modal_url: + raise ValueError("Modal URL not provided and MODAL_URL env variable not set") + + # Ensure URL ends without trailing slash + self.modal_url = self.modal_url.rstrip('/') + + # Create session with connection pooling + self.session = requests.Session() + + # Configure retry strategy + retry_strategy = Retry( + total=1, # 1 retry + backoff_factor=0.2, # 0.2s backoff + status_forcelist=[429, 500, 502, 503, 504], + allowed_methods=["POST", "GET"] + ) + + # Mount adapter with retry strategy + adapter = HTTPAdapter( + max_retries=retry_strategy, + pool_connections=10, + pool_maxsize=20 + ) + self.session.mount("http://", adapter) + self.session.mount("https://", adapter) + + # Set keep-alive headers + self.session.headers.update({ + "Connection": "keep-alive", + "Accept": "application/json", + "Content-Type": "application/json" + }) + + # Timeouts + self.connect_timeout = 5 + self.read_timeout = 30 + + logger.info(f" ModalEmbeddingClient initialized with URL: {self.modal_url}") + + def __call__(self, text: str) -> Optional[List[float]]: + """ + Generate embedding for a single text + + Args: + text: Text to embed + + Returns: + Embedding vector (list of floats) or None if failed + """ + if not text or not text.strip(): + logger.warning("Empty text provided for embedding") + return None + + try: + # Prepare request payload + payload = { + "doc_id": f"doc_{hash(text)}", + "content": text, + "metadata": {} + } + + # Make request with timeouts + response = self.session.post( + f"{self.modal_url}/embedding/embed", + json=payload, + timeout=(self.connect_timeout, self.read_timeout) + ) + + # Check response + response.raise_for_status() + + # Parse embeddings + result = response.json() + embeddings = result.get("embeddings", []) + + if embeddings and len(embeddings) > 0: + return embeddings[0] + else: + logger.error("No embeddings returned from service") + return None + + except requests.exceptions.Timeout: + 
logger.error("Embedding request timed out") + return None + + except requests.exceptions.RequestException as e: + logger.error(f"Embedding request failed: {e}") + return None + + except Exception as e: + logger.error(f"Unexpected error in embedding: {e}") + return None + + def embed_batch(self, texts: List[str]) -> List[Optional[List[float]]]: + """ + Generate embeddings for multiple texts + + Args: + texts: List of texts to embed + + Returns: + List of embedding vectors (or None for failed embeddings) + """ + embeddings = [] + + for text in texts: + embedding = self(text) + embeddings.append(embedding) + + # Brief pause to avoid rate limiting + time.sleep(0.01) + + return embeddings + + def health_check(self) -> bool: + """ + Check if embedding service is healthy + + Returns: + True if service is healthy, False otherwise + """ + try: + response = self.session.get( + f"{self.modal_url}/embedding/health", + timeout=(self.connect_timeout, 10) + ) + + response.raise_for_status() + result = response.json() + + is_healthy = result.get("status") == "ok" + + if is_healthy: + logger.info(f" Embedding service healthy (dimension: {result.get('embedding_dimension')})") + else: + logger.warning("️ Embedding service returned unhealthy status") + + return is_healthy + + except Exception as e: + logger.error(f" Health check failed: {e}") + return False + + def close(self): + """Close session and cleanup resources""" + self.session.close() + logger.info(" Embedding client session closed") + + def __enter__(self): + """Context manager entry""" + return self + + def __exit__(self, exc_type, exc_val, exc_tb): + """Context manager exit""" + self.close() + + +class AsyncModalEmbeddingClient: + """Async version of embedding client using httpx""" + + def __init__(self, modal_url: Optional[str] = None): + """ + Initialize async embedding client + + Args: + modal_url: Modal embedding service URL + """ + self.modal_url = modal_url or os.getenv("MODAL_URL") + + if not self.modal_url: + raise 
ValueError("Modal URL not provided and MODAL_URL env variable not set") + + self.modal_url = self.modal_url.rstrip('/') + + # httpx client will be created when needed + self._client = None + + self.connect_timeout = 5 + self.read_timeout = 30 + + logger.info(f" AsyncModalEmbeddingClient initialized with URL: {self.modal_url}") + + async def _get_client(self): + """Lazy initialization of httpx client""" + if self._client is None: + import httpx + + self._client = httpx.AsyncClient( + timeout=httpx.Timeout( + connect=self.connect_timeout, + read=self.read_timeout + ), + limits=httpx.Limits( + max_keepalive_connections=10, + max_connections=20 + ) + ) + + return self._client + + async def __call__(self, text: str) -> Optional[List[float]]: + """ + Generate embedding for a single text (async) + + Args: + text: Text to embed + + Returns: + Embedding vector or None if failed + """ + if not text or not text.strip(): + logger.warning("Empty text provided for embedding") + return None + + try: + client = await self._get_client() + + payload = { + "doc_id": f"doc_{hash(text)}", + "content": text, + "metadata": {} + } + + response = await client.post( + f"{self.modal_url}/embedding/embed", + json=payload + ) + + response.raise_for_status() + + result = response.json() + embeddings = result.get("embeddings", []) + + if embeddings and len(embeddings) > 0: + return embeddings[0] + else: + logger.error("No embeddings returned from service") + return None + + except Exception as e: + logger.error(f"Async embedding request failed: {e}") + return None + + async def close(self): + """Close async client""" + if self._client: + await self._client.aclose() + logger.info(" Async embedding client closed") + + async def __aenter__(self): + """Async context manager entry""" + return self + + async def __aexit__(self, exc_type, exc_val, exc_tb): + """Async context manager exit""" + await self.close() diff --git a/submissions/team_2/chatbot/backend/src/utils/got_utils.py 
b/submissions/team_2/chatbot/backend/src/utils/got_utils.py new file mode 100644 index 0000000..cd6100c --- /dev/null +++ b/submissions/team_2/chatbot/backend/src/utils/got_utils.py @@ -0,0 +1,340 @@ +""" +Graph of Thought (GoT) Utilities +NetworkX serialization, Pyvis visualization, semantic caching +""" + +import json +import logging +import math +from typing import Dict, List, Optional, Tuple +from pathlib import Path + +import networkx as nx +from pyvis.network import Network +import chromadb +from chromadb.config import Settings + +from src.utils.embedding_client import ModalEmbeddingClient + +logger = logging.getLogger(__name__) + + +class GoTCache: + """ + Semantic cache for verified thoughts using ChromaDB + """ + + def __init__(self, chroma_dir: str, embedding_client: ModalEmbeddingClient): + """ + Initialize the GoT cache + + Args: + chroma_dir: Directory for ChromaDB persistence + embedding_client: Modal embedding client for generating embeddings + """ + self.embedding_client = embedding_client + + # Initialize ChromaDB client + self.client = chromadb.PersistentClient( + path=chroma_dir, + settings=Settings(anonymized_telemetry=False) + ) + + # Get or create the verified thoughts collection + self.collection = self.client.get_or_create_collection( + name="verified_thoughts", + metadata={"hnsw:space": "cosine"} + ) + + logger.info(f"GoT Cache initialized with {self.collection.count()} verified thoughts") + + def check_cache(self, query: str, threshold: float = 0.1) -> Optional[Dict]: + """ + Check if a similar thought exists in the cache + + Args: + query: The sub-query or thought to check + threshold: Distance threshold (< 0.1 means very similar) + + Returns: + Cached result dict if found, None otherwise + """ + # Generate embedding for query + query_embedding = self.embedding_client(query) + + if not query_embedding: + return None + + # Search in cache collection + results = self.collection.query( + query_embeddings=[query_embedding], + n_results=1 + ) 
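The hit/miss decision in `check_cache` reduces to a threshold on cosine distance (1 minus cosine similarity, the metric the collection is configured with via `hnsw:space: cosine`). A self-contained sketch of that comparison, using hypothetical low-dimensional embeddings in place of real Modal outputs:

```python
import math

def cosine_distance(a, b):
    """1 - cosine similarity: the distance a cosine-space ChromaDB collection reports."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / norm

THRESHOLD = 0.1  # same default cutoff as check_cache

near = cosine_distance([1.0, 0.0, 0.2], [1.0, 0.0, 0.21])  # nearly identical direction
far = cosine_distance([1.0, 0.0, 0.0], [0.0, 1.0, 0.0])    # orthogonal vectors

print(near < THRESHOLD)  # -> True  (cache hit)
print(far < THRESHOLD)   # -> False (cache miss)
```

Because the threshold is on distance rather than raw similarity, smaller is better; 0.1 corresponds to cosine similarity above 0.9.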
+ + if not results['ids'] or len(results['ids'][0]) == 0: + return None + + # Check distance threshold + distance = results['distances'][0][0] + + if distance < threshold: + logger.info(f"Cache HIT! Distance: {distance:.4f}") + return { + "thought": results['documents'][0][0], + "sources": json.loads(results['metadatas'][0][0].get('sources', '[]')), + "verification_score": results['metadatas'][0][0].get('verification_score', 0.0), + "cached": True + } + + logger.info(f"Cache MISS. Distance: {distance:.4f}") + return None + + def add_to_cache(self, thought: str, sources: List[str], verification_score: float): + """ + Add a verified thought to the cache + + Args: + thought: The verified thought text + sources: List of source URLs + verification_score: Verification score from MoE + """ + # Generate embedding + embedding = self.embedding_client(thought) + + if not embedding: + logger.error("Failed to generate embedding for thought") + return + + # Generate unique ID + thought_id = f"thought_{self.collection.count() + 1:06d}" + + # Add to collection + self.collection.add( + ids=[thought_id], + embeddings=[embedding], + documents=[thought], + metadatas=[{ + "sources": json.dumps(sources), + "verification_score": verification_score + }] + ) + + logger.info(f"Added thought to cache: {thought_id}") + + +class NetworkXManager: + """ + Manages NetworkX graph serialization and operations + """ + + @staticmethod + def create_graph() -> nx.DiGraph: + """Create a new directed graph for GoT""" + return nx.DiGraph() + + @staticmethod + def add_node( + graph: nx.DiGraph, + node_id: str, + thought: str, + sources: List[str], + parent_nodes: List[str], + verification_score: float, + expert_remarks: str = "" + ) -> nx.DiGraph: + """ + Add a node to the graph + + Args: + graph: NetworkX DiGraph + node_id: Unique node identifier + thought: The thought/reasoning text + sources: List of source URLs + parent_nodes: List of parent node IDs + verification_score: Verification score from MoE + 
expert_remarks: Optional remarks from experts + + Returns: + Updated graph + """ + graph.add_node( + node_id, + thought=thought, + sources=sources, + parent_nodes=parent_nodes, + verification_score=verification_score, + expert_remarks=expert_remarks + ) + + # Add edges from parents + for parent_id in parent_nodes: + if parent_id in graph.nodes: + graph.add_edge(parent_id, node_id) + + return graph + + @staticmethod + def merge_nodes(graph: nx.DiGraph, node_id: str, parent_id: str) -> nx.DiGraph: + """ + Merge a redundant node with its parent + + Args: + graph: NetworkX DiGraph + node_id: Node to merge + parent_id: Parent node to merge into + + Returns: + Updated graph + """ + if node_id in graph.nodes and parent_id in graph.nodes: + # Merge thoughts + parent_thought = graph.nodes[parent_id].get('thought', '') + node_thought = graph.nodes[node_id].get('thought', '') + graph.nodes[parent_id]['thought'] = f"{parent_thought}\n\n{node_thought}" + + # Merge sources + parent_sources = set(graph.nodes[parent_id].get('sources', [])) + node_sources = set(graph.nodes[node_id].get('sources', [])) + graph.nodes[parent_id]['sources'] = list(parent_sources | node_sources) + + # Remove redundant node + graph.remove_node(node_id) + + logger.info(f"Merged node {node_id} into {parent_id}") + + return graph + + @staticmethod + def to_json(graph: nx.DiGraph) -> str: + """ + Serialize graph to JSON + + Args: + graph: NetworkX DiGraph + + Returns: + JSON string + """ + data = nx.node_link_data(graph) + return json.dumps(data, indent=2) + + @staticmethod + def from_json(json_str: str) -> nx.DiGraph: + """ + Deserialize graph from JSON + + Args: + json_str: JSON string + + Returns: + NetworkX DiGraph + """ + data = json.loads(json_str) + return nx.node_link_graph(data, directed=True) + + @staticmethod + def get_best_path(graph: nx.DiGraph, start_node: str, end_node: str) -> List[str]: + """ + Get the best path between two nodes based on verification scores + + Args: + graph: NetworkX 
DiGraph + start_node: Starting node ID + end_node: Ending node ID + + Returns: + List of node IDs in the best path + """ + try: + # Weight edges inversely by verification score (lower weight = better) + for u, v in graph.edges(): + score = graph.nodes[v].get('verification_score', 0.5) + graph[u][v]['weight'] = 1.0 / (score + 0.01) # Avoid division by zero + + # Find shortest path (which is best path with our weighting) + path = nx.shortest_path(graph, start_node, end_node, weight='weight') + return path + + except nx.NetworkXNoPath: + logger.warning(f"No path found between {start_node} and {end_node}") + return [] + + +class PyvisVisualizer: + """ + Converts NetworkX graph to interactive Pyvis HTML + """ + + @staticmethod + def generate_html(graph: nx.DiGraph, output_path: Optional[str] = None) -> str: + """ + Generate interactive HTML visualization of the graph + + Args: + graph: NetworkX DiGraph + output_path: Optional path to save HTML file + + Returns: + HTML string + """ + # Create Pyvis network + net = Network( + height="750px", + width="100%", + directed=True, + bgcolor="#222222", + font_color="white" + ) + + # Set physics layout + net.barnes_hut( + gravity=-8000, + central_gravity=0.3, + spring_length=200, + spring_strength=0.001, + damping=0.09 + ) + + # Add nodes with styling based on verification score + for node_id in graph.nodes(): + node_data = graph.nodes[node_id] + thought = node_data.get('thought', '') + score = node_data.get('verification_score', 0.0) + sources = node_data.get('sources', []) + + # Color based on verification score + if score >= 0.9: + color = "#00ff00" # Green + elif score >= 0.7: + color = "#ffff00" # Yellow + else: + color = "#ff0000" # Red + + # Create hover title with details + title = f"{node_id}
" + title += f"Score: {score:.2f}
" + title += f"
{thought[:200]}...
" + if sources: + title += f"
Sources: {len(sources)}" + + net.add_node( + node_id, + label=f"{node_id}\n{score:.2f}", + title=title, + color=color, + size=20 + (score * 30) + ) + + # Add edges + for u, v in graph.edges(): + net.add_edge(u, v) + + # Generate HTML + if output_path: + net.save_graph(output_path) + with open(output_path, 'r') as f: + html = f.read() + else: + html = net.generate_html() + + return html diff --git a/submissions/team_2/chatbot/backend/src/utils/groq_client.py b/submissions/team_2/chatbot/backend/src/utils/groq_client.py new file mode 100644 index 0000000..abf7fae --- /dev/null +++ b/submissions/team_2/chatbot/backend/src/utils/groq_client.py @@ -0,0 +1,147 @@ +""" +Groq Client +Client for Groq API with high-performance models +""" + +import logging +from typing import List +import asyncio +import os + +from groq import Groq + +logger = logging.getLogger(__name__) + + +class GroqClient: + """ + Client for Groq models + Uses models with highest RPM for optimal performance + """ + + def __init__(self, api_key: str = None): + """ + Initialize the Groq client + + Args: + api_key: Groq API key (optional, will use GROQ_API_KEY env var if not provided) + """ + self.api_key = api_key or os.getenv("GROQ_API_KEY") + self.client = Groq(api_key=self.api_key) + + # Using Llama-4-Scout-17b-16e for both judge and expert reasoning + # meta-llama/llama-4-scout-17b-16e-instruct: High performance model with excellent reasoning + self.judge_model = "meta-llama/llama-4-scout-17b-16e-instruct" + + # For experts (fast verification), use the same model + self.expert_model = "meta-llama/llama-4-scout-17b-16e-instruct" + + logger.info(f"GroqClient initialized with models: judge={self.judge_model}, expert={self.expert_model}") + + async def generate_judge(self, prompt: str, max_tokens: int = 2048) -> str: + """ + Call Groq model for complex reasoning + + Args: + prompt: Input prompt + max_tokens: Maximum tokens to generate + + Returns: + Generated text + """ + try: + logger.info(f"Calling Judge 
model ({self.judge_model}) with prompt length: {len(prompt)}") + + # Run in executor since Groq SDK is sync + loop = asyncio.get_event_loop() + + def _generate(): + completion = self.client.chat.completions.create( + model=self.judge_model, + messages=[ + { + "role": "user", + "content": prompt + } + ], + temperature=0.7, + max_tokens=max_tokens, + top_p=0.9, + stream=False + ) + return completion.choices[0].message.content + + text = await loop.run_in_executor(None, _generate) + logger.info(f"Judge model generated {len(text)} chars") + return text + + except Exception as e: + logger.error(f"Judge model error: {e}") + raise + + async def generate_expert(self, prompt: str, max_tokens: int = 1024) -> str: + """ + Call Groq model for fast verification + + Args: + prompt: Input prompt + max_tokens: Maximum tokens to generate + + Returns: + Generated text + """ + try: + logger.info(f"Calling Expert model ({self.expert_model}) with prompt length: {len(prompt)}") + + # Run in executor since Groq SDK is sync + loop = asyncio.get_event_loop() + + def _generate(): + completion = self.client.chat.completions.create( + model=self.expert_model, + messages=[ + { + "role": "user", + "content": prompt + } + ], + temperature=0.7, + max_tokens=max_tokens, + top_p=0.9, + stream=False + ) + return completion.choices[0].message.content + + text = await loop.run_in_executor(None, _generate) + logger.info(f"Expert model generated {len(text)} chars") + return text + + except Exception as e: + logger.error(f"Expert model error: {e}") + raise + + async def batch_generate_expert(self, prompts: List[str], max_tokens: int = 1024) -> List[str]: + """ + Batch generate using Expert model + + Args: + prompts: List of input prompts + max_tokens: Maximum tokens to generate per prompt + + Returns: + List of generated texts + """ + try: + logger.info(f"Batch calling Expert model with {len(prompts)} prompts") + + results = [] + for prompt in prompts: + text = await self.generate_expert(prompt, 
max_tokens) + results.append(text) + + logger.info(f"Expert model batch generated {len(results)} responses") + return results + + except Exception as e: + logger.error(f"Expert batch error: {e}") + raise diff --git a/submissions/team_2/chatbot/backend/src/utils/modal_embeddings.py b/submissions/team_2/chatbot/backend/src/utils/modal_embeddings.py new file mode 100644 index 0000000..706ff77 --- /dev/null +++ b/submissions/team_2/chatbot/backend/src/utils/modal_embeddings.py @@ -0,0 +1,91 @@ +""" +Modal Embedding Service for MetaKGP RAG System +Deploys sentence-transformers/all-MiniLM-L6-v2 (384 dimensions) on Modal +""" + +import modal +from typing import List, Dict + +app = modal.App("metakgp-embeddings") + +image = modal.Image.debian_slim(python_version="3.11").pip_install( + "sentence-transformers==3.0.1", + "torch==2.4.1", + "transformers==4.45.0", + "fastapi==0.109.0", + "pydantic==2.5.3", + "huggingface-hub>=0.20.0", + "numpy>=1.24.0" +) + +@app.function( + image=image, + gpu="A100", + min_containers=1, + timeout=300, + scaledown_window=120 +) +def embed_batch(texts: List[str]) -> List[List[float]]: + """Batch embed texts using sentence-transformers""" + from sentence_transformers import SentenceTransformer + + model = SentenceTransformer('sentence-transformers/all-MiniLM-L6-v2') + embeddings = model.encode(texts, batch_size=500, show_progress_bar=False) + return embeddings.tolist() + +@app.function(image=image) +def get_embedding_dimension() -> int: + """Return embedding dimension""" + return 384 + +@app.function(image=image) +@modal.asgi_app() +def fastapi_app(): + from fastapi import FastAPI + from pydantic import BaseModel + + web_app = FastAPI() + + class EmbedRequest(BaseModel): + doc_id: str + content: str + metadata: Dict = {} + + class HealthResponse(BaseModel): + status: str + embedding_dimension: int + + @web_app.post("/embedding/embed") + async def embed(request: EmbedRequest): + """ + Expected request format: + { + "doc_id": "unique_id", + 
"content": "text to embed", + "metadata": {"source": "..."} + } + + Returns: + { + "embeddings": [[float, float, ...]] + } + """ + # Call the embed_batch function + embeddings = embed_batch.remote([request.content]) + return {"embeddings": embeddings} + + @web_app.get("/embedding/health") + async def health(): + """ + Returns: + { + "status": "ok", + "embedding_dimension": 384 + } + """ + return HealthResponse( + status="ok", + embedding_dimension=384 + ) + + return web_app diff --git a/submissions/team_2/chatbot/backend/src/utils/wiki_pages.py b/submissions/team_2/chatbot/backend/src/utils/wiki_pages.py new file mode 100644 index 0000000..e69de29 diff --git a/submissions/team_2/chatbot/backend/start.sh b/submissions/team_2/chatbot/backend/start.sh new file mode 100755 index 0000000..1121c89 --- /dev/null +++ b/submissions/team_2/chatbot/backend/start.sh @@ -0,0 +1,92 @@ +#!/bin/bash +# Start MetaKGP RAG System + FastAPI Service + +set -e + +echo " Starting MetaKGP RAG + FastAPI Server" + +# Get the team_2 root directory (5 levels up from start.sh) +SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)" +TEAM_2_ROOT="$(cd "$SCRIPT_DIR/../../" && pwd)" +ENV_FILE="$TEAM_2_ROOT/.env" + +# Load environment variables from unified .env at team_2 root +if [ -f "$ENV_FILE" ]; then + echo " Loading .env from: $ENV_FILE" + export $(cat "$ENV_FILE" | grep -v '^#' | xargs) +else + echo " .env file not found at: $ENV_FILE" + echo " Please create one from: $TEAM_2_ROOT/.env.example" + exit 1 +fi + +# Validate required environment variables +# Check if database config is provided (either individual components or full URL) +if [ -z "$DB_HOST" ] && [ -z "$DATABASE_URL" ]; then + echo " Database configuration not set in .env" + echo " Set either DB_HOST, DB_NAME, DB_USER, DB_PASSWORD or DATABASE_URL" + exit 1 +fi + +if [ -z "$MODAL_URL" ]; then + echo " MODAL_URL not set in .env" + exit 1 +fi + +# Activate virtual environment if it exists +if [ -d ".venv" ]; then + source 
.venv/bin/activate +elif [ -d "venv" ]; then + source venv/bin/activate +else + echo " Virtual environment not found. Please run: uv venv && uv pip install -e ." + exit 1 +fi + +# Create necessary directories +mkdir -p cache chroma_data logs + +# Get absolute paths for consistency +BACKEND_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)" +CHROMA_ABS_PATH="${BACKEND_DIR}/${CHROMA_DIR:-chroma_data}" +CACHE_ABS_PATH="${BACKEND_DIR}/${CACHE_DIR:-cache}" + +echo "" +echo " Starting RAG Indexer..." +python src/services/indexing/indexer.py \ + --chroma-dir "$CHROMA_ABS_PATH" \ + --cache-dir "$CACHE_ABS_PATH" \ + --batch-size "${BATCH_SIZE:-100}" \ + > logs/indexer.log 2>&1 & + +INDEXER_PID=$! +echo " Indexer started (PID: $INDEXER_PID)" + +# Wait for indexer to initialize +sleep 2 + +echo "" +echo " Starting FastAPI Server..." +# Set CHROMA_DIR and CACHE_DIR as environment variables for query service +export CHROMA_DIR="$CHROMA_ABS_PATH" +export CACHE_DIR="$CACHE_ABS_PATH" +uvicorn src.app.main:app \ + --host "${HOST:-0.0.0.0}" \ + --port "${PORT:-8000}" \ + --log-level info \ + > logs/fastapi_server.log 2>&1 & + +FastAPI_PID=$! +echo " FastAPI Server started (PID: $FastAPI_PID)" + +echo "" +echo " All services started successfully!" +echo "" +echo " Logs:" +echo " Indexer: tail -f logs/indexer.log" +echo " FastAPI Server: tail -f logs/fastapi_server.log" +echo "" +echo " API:" +echo " Docs: http://localhost:${PORT:-8000}/docs" +echo "" +echo "Stop: ./stop.sh" diff --git a/submissions/team_2/chatbot/backend/stop.sh b/submissions/team_2/chatbot/backend/stop.sh new file mode 100755 index 0000000..8b314f3 --- /dev/null +++ b/submissions/team_2/chatbot/backend/stop.sh @@ -0,0 +1,51 @@ +#!/bin/bash +# Stop MetaKGP RAG System - Indexer + FastAPI Query Service + +set -e + +echo "Stopping MetaKGP RAG System" + +# Clean up processes by name +echo "" +echo " Stopping services..." 
+ +STOPPED=0 + +# Stop Indexer +INDEXER_PIDS=$(pgrep -f "services/indexing/indexer.py" 2>/dev/null || true) +if [ ! -z "$INDEXER_PIDS" ]; then + echo " Stopping Indexer (PID: $INDEXER_PIDS)..." + kill $INDEXER_PIDS 2>/dev/null || true + sleep 2 + + # Force kill if still running + if pgrep -f "services/indexing/indexer.py" > /dev/null 2>&1; then + echo " Force killing..." + pkill -9 -f "services/indexing/indexer.py" 2>/dev/null || true + fi + echo " Indexer stopped" + STOPPED=$((STOPPED + 1)) +fi + +# Stop FastAPI Server +FastAPI_PIDS=$(pgrep -f "uvicorn.*src.app.main:app" 2>/dev/null || true) +if [ ! -z "$FastAPI_PIDS" ]; then + echo " Stopping FastAPI Server (PID: $FastAPI_PIDS)..." + kill $FastAPI_PIDS 2>/dev/null || true + sleep 2 + + # Force kill if still running + if pgrep -f "uvicorn.*src.app.main:app" > /dev/null 2>&1; then + echo " Force killing..." + pkill -9 -f "uvicorn.*src.app.main:app" 2>/dev/null || true + fi + echo " FastAPI Server stopped" + STOPPED=$((STOPPED + 1)) +fi + +echo "" +if [ $STOPPED -gt 0 ]; then + echo " Stopped $STOPPED service(s)" +else + echo "No services were running" +fi \ No newline at end of file diff --git a/submissions/team_2/chatbot/frontend/.gitignore b/submissions/team_2/chatbot/frontend/.gitignore new file mode 100644 index 0000000..a547bf3 --- /dev/null +++ b/submissions/team_2/chatbot/frontend/.gitignore @@ -0,0 +1,24 @@ +# Logs +logs +*.log +npm-debug.log* +yarn-debug.log* +yarn-error.log* +pnpm-debug.log* +lerna-debug.log* + +node_modules +dist +dist-ssr +*.local + +# Editor directories and files +.vscode/* +!.vscode/extensions.json +.idea +.DS_Store +*.suo +*.ntvs* +*.njsproj +*.sln +*.sw? 
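A note on `NetworkXManager.get_best_path` above: it turns "find the best-verified reasoning chain" into a shortest-path problem by weighting each edge inversely to the target node's verification score. A minimal standalone sketch of that weighting (the node IDs and scores below are made up for illustration; only `networkx` is assumed):

```python
import networkx as nx

# Build a toy GoT-style graph the way NetworkXManager.add_node would:
# two branches from "root" to "goal", one through a well-verified
# thought ("a") and one through a poorly verified thought ("b").
g = nx.DiGraph()
for node_id, score in [("root", 1.0), ("a", 0.9), ("b", 0.2), ("goal", 0.8)]:
    g.add_node(node_id, verification_score=score)
g.add_edges_from([("root", "a"), ("root", "b"), ("a", "goal"), ("b", "goal")])

# Mirror get_best_path: each edge costs 1 / (target score + 0.01),
# so paths through well-verified nodes are "shorter".
for u, v in g.edges():
    g[u][v]["weight"] = 1.0 / (g.nodes[v]["verification_score"] + 0.01)

best = nx.shortest_path(g, "root", "goal", weight="weight")
print(best)  # ['root', 'a', 'goal'] (the high-scoring branch wins)
```

With this weighting, Dijkstra's shortest path naturally prefers chains of well-verified thoughts, and the `+ 0.01` floor keeps zero-score nodes from causing a division by zero.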
diff --git a/submissions/team_2/chatbot/frontend/README.md b/submissions/team_2/chatbot/frontend/README.md new file mode 100644 index 0000000..7642bae --- /dev/null +++ b/submissions/team_2/chatbot/frontend/README.md @@ -0,0 +1,29 @@ +# GraphMind Frontend + +A React-based frontend for the GraphMind chatbot, built for IIT Kharagpur. + +## Setup + +1. Install dependencies: +```bash +npm install +``` + +2. Run the development server: +```bash +npm run dev +``` + +3. Build for production: +```bash +npm run build +``` + +## Features + +- Clean, modern chat interface +- Real-time messaging with loading states +- Example prompts for quick queries +- Integration with backend API at `http://localhost:8000/got/query` +- Responsive design for mobile and desktop +- Dark theme matching IIT Kharagpur branding diff --git a/submissions/team_2/chatbot/frontend/index.html b/submissions/team_2/chatbot/frontend/index.html new file mode 100644 index 0000000..ec482ab --- /dev/null +++ b/submissions/team_2/chatbot/frontend/index.html @@ -0,0 +1,13 @@ +<!doctype html> +<html lang="en"> +  <head> +    <meta charset="UTF-8" /> +    <link rel="icon" type="image/svg+xml" href="/vite.svg" /> +    <meta name="viewport" content="width=device-width, initial-scale=1.0" /> +    <title>GraphMind - IIT Kharagpur</title> +  </head> +  <body> +    <div id="root"></div>
+ + + diff --git a/submissions/team_2/chatbot/frontend/package-lock.json b/submissions/team_2/chatbot/frontend/package-lock.json new file mode 100644 index 0000000..eaeecfc --- /dev/null +++ b/submissions/team_2/chatbot/frontend/package-lock.json @@ -0,0 +1,6636 @@ +{ + "name": "metakgp-chatbot-frontend", + "version": "1.0.0", + "lockfileVersion": 3, + "requires": true, + "packages": { + "": { + "name": "metakgp-chatbot-frontend", + "version": "1.0.0", + "dependencies": { + "react": "^18.3.1", + "react-dom": "^18.3.1", + "react-markdown": "^10.1.0", + "tailwindcss": "^4.1.18" + }, + "devDependencies": { + "@tailwindcss/vite": "^4.1.18", + "@types/react": "^18.3.18", + "@types/react-dom": "^18.3.5", + "@vitejs/plugin-react": "^4.3.4", + "autoprefixer": "^10.4.20", + "eslint": "^9.18.0", + "eslint-plugin-react": "^7.37.2", + "eslint-plugin-react-hooks": "^5.1.0", + "eslint-plugin-react-refresh": "^0.4.16", + "vite": "^6.0.7" + } + }, + "node_modules/@babel/code-frame": { + "version": "7.28.6", + "resolved": "https://registry.npmjs.org/@babel/code-frame/-/code-frame-7.28.6.tgz", + "integrity": "sha512-JYgintcMjRiCvS8mMECzaEn+m3PfoQiyqukOMCCVQtoJGYJw8j/8LBJEiqkHLkfwCcs74E3pbAUFNg7d9VNJ+Q==", + "dev": true, + "license": "MIT", + "dependencies": { + "@babel/helper-validator-identifier": "^7.28.5", + "js-tokens": "^4.0.0", + "picocolors": "^1.1.1" + }, + "engines": { + "node": ">=6.9.0" + } + }, + "node_modules/@babel/compat-data": { + "version": "7.28.6", + "resolved": "https://registry.npmjs.org/@babel/compat-data/-/compat-data-7.28.6.tgz", + "integrity": "sha512-2lfu57JtzctfIrcGMz992hyLlByuzgIk58+hhGCxjKZ3rWI82NnVLjXcaTqkI2NvlcvOskZaiZ5kjUALo3Lpxg==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=6.9.0" + } + }, + "node_modules/@babel/core": { + "version": "7.28.6", + "resolved": "https://registry.npmjs.org/@babel/core/-/core-7.28.6.tgz", + "integrity": "sha512-H3mcG6ZDLTlYfaSNi0iOKkigqMFvkTKlGUYlD8GW7nNOYRrevuA46iTypPyv+06V3fEmvvazfntkBU34L0azAw==", + 
"dev": true, + "license": "MIT", + "dependencies": { + "@babel/code-frame": "^7.28.6", + "@babel/generator": "^7.28.6", + "@babel/helper-compilation-targets": "^7.28.6", + "@babel/helper-module-transforms": "^7.28.6", + "@babel/helpers": "^7.28.6", + "@babel/parser": "^7.28.6", + "@babel/template": "^7.28.6", + "@babel/traverse": "^7.28.6", + "@babel/types": "^7.28.6", + "@jridgewell/remapping": "^2.3.5", + "convert-source-map": "^2.0.0", + "debug": "^4.1.0", + "gensync": "^1.0.0-beta.2", + "json5": "^2.2.3", + "semver": "^6.3.1" + }, + "engines": { + "node": ">=6.9.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/babel" + } + }, + "node_modules/@babel/generator": { + "version": "7.28.6", + "resolved": "https://registry.npmjs.org/@babel/generator/-/generator-7.28.6.tgz", + "integrity": "sha512-lOoVRwADj8hjf7al89tvQ2a1lf53Z+7tiXMgpZJL3maQPDxh0DgLMN62B2MKUOFcoodBHLMbDM6WAbKgNy5Suw==", + "dev": true, + "license": "MIT", + "dependencies": { + "@babel/parser": "^7.28.6", + "@babel/types": "^7.28.6", + "@jridgewell/gen-mapping": "^0.3.12", + "@jridgewell/trace-mapping": "^0.3.28", + "jsesc": "^3.0.2" + }, + "engines": { + "node": ">=6.9.0" + } + }, + "node_modules/@babel/helper-compilation-targets": { + "version": "7.28.6", + "resolved": "https://registry.npmjs.org/@babel/helper-compilation-targets/-/helper-compilation-targets-7.28.6.tgz", + "integrity": "sha512-JYtls3hqi15fcx5GaSNL7SCTJ2MNmjrkHXg4FSpOA/grxK8KwyZ5bubHsCq8FXCkua6xhuaaBit+3b7+VZRfcA==", + "dev": true, + "license": "MIT", + "dependencies": { + "@babel/compat-data": "^7.28.6", + "@babel/helper-validator-option": "^7.27.1", + "browserslist": "^4.24.0", + "lru-cache": "^5.1.1", + "semver": "^6.3.1" + }, + "engines": { + "node": ">=6.9.0" + } + }, + "node_modules/@babel/helper-globals": { + "version": "7.28.0", + "resolved": "https://registry.npmjs.org/@babel/helper-globals/-/helper-globals-7.28.0.tgz", + "integrity": 
"sha512-+W6cISkXFa1jXsDEdYA8HeevQT/FULhxzR99pxphltZcVaugps53THCeiWA8SguxxpSp3gKPiuYfSWopkLQ4hw==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=6.9.0" + } + }, + "node_modules/@babel/helper-module-imports": { + "version": "7.28.6", + "resolved": "https://registry.npmjs.org/@babel/helper-module-imports/-/helper-module-imports-7.28.6.tgz", + "integrity": "sha512-l5XkZK7r7wa9LucGw9LwZyyCUscb4x37JWTPz7swwFE/0FMQAGpiWUZn8u9DzkSBWEcK25jmvubfpw2dnAMdbw==", + "dev": true, + "license": "MIT", + "dependencies": { + "@babel/traverse": "^7.28.6", + "@babel/types": "^7.28.6" + }, + "engines": { + "node": ">=6.9.0" + } + }, + "node_modules/@babel/helper-module-transforms": { + "version": "7.28.6", + "resolved": "https://registry.npmjs.org/@babel/helper-module-transforms/-/helper-module-transforms-7.28.6.tgz", + "integrity": "sha512-67oXFAYr2cDLDVGLXTEABjdBJZ6drElUSI7WKp70NrpyISso3plG9SAGEF6y7zbha/wOzUByWWTJvEDVNIUGcA==", + "dev": true, + "license": "MIT", + "dependencies": { + "@babel/helper-module-imports": "^7.28.6", + "@babel/helper-validator-identifier": "^7.28.5", + "@babel/traverse": "^7.28.6" + }, + "engines": { + "node": ">=6.9.0" + }, + "peerDependencies": { + "@babel/core": "^7.0.0" + } + }, + "node_modules/@babel/helper-plugin-utils": { + "version": "7.28.6", + "resolved": "https://registry.npmjs.org/@babel/helper-plugin-utils/-/helper-plugin-utils-7.28.6.tgz", + "integrity": "sha512-S9gzZ/bz83GRysI7gAD4wPT/AI3uCnY+9xn+Mx/KPs2JwHJIz1W8PZkg2cqyt3RNOBM8ejcXhV6y8Og7ly/Dug==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=6.9.0" + } + }, + "node_modules/@babel/helper-string-parser": { + "version": "7.27.1", + "resolved": "https://registry.npmjs.org/@babel/helper-string-parser/-/helper-string-parser-7.27.1.tgz", + "integrity": "sha512-qMlSxKbpRlAridDExk92nSobyDdpPijUq2DW6oDnUqd0iOGxmQjyqhMIihI9+zv4LPyZdRje2cavWPbCbWm3eA==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=6.9.0" + } + }, + 
"node_modules/@babel/helper-validator-identifier": { + "version": "7.28.5", + "resolved": "https://registry.npmjs.org/@babel/helper-validator-identifier/-/helper-validator-identifier-7.28.5.tgz", + "integrity": "sha512-qSs4ifwzKJSV39ucNjsvc6WVHs6b7S03sOh2OcHF9UHfVPqWWALUsNUVzhSBiItjRZoLHx7nIarVjqKVusUZ1Q==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=6.9.0" + } + }, + "node_modules/@babel/helper-validator-option": { + "version": "7.27.1", + "resolved": "https://registry.npmjs.org/@babel/helper-validator-option/-/helper-validator-option-7.27.1.tgz", + "integrity": "sha512-YvjJow9FxbhFFKDSuFnVCe2WxXk1zWc22fFePVNEaWJEu8IrZVlda6N0uHwzZrUM1il7NC9Mlp4MaJYbYd9JSg==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=6.9.0" + } + }, + "node_modules/@babel/helpers": { + "version": "7.28.6", + "resolved": "https://registry.npmjs.org/@babel/helpers/-/helpers-7.28.6.tgz", + "integrity": "sha512-xOBvwq86HHdB7WUDTfKfT/Vuxh7gElQ+Sfti2Cy6yIWNW05P8iUslOVcZ4/sKbE+/jQaukQAdz/gf3724kYdqw==", + "dev": true, + "license": "MIT", + "dependencies": { + "@babel/template": "^7.28.6", + "@babel/types": "^7.28.6" + }, + "engines": { + "node": ">=6.9.0" + } + }, + "node_modules/@babel/parser": { + "version": "7.28.6", + "resolved": "https://registry.npmjs.org/@babel/parser/-/parser-7.28.6.tgz", + "integrity": "sha512-TeR9zWR18BvbfPmGbLampPMW+uW1NZnJlRuuHso8i87QZNq2JRF9i6RgxRqtEq+wQGsS19NNTWr2duhnE49mfQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "@babel/types": "^7.28.6" + }, + "bin": { + "parser": "bin/babel-parser.js" + }, + "engines": { + "node": ">=6.0.0" + } + }, + "node_modules/@babel/plugin-transform-react-jsx-self": { + "version": "7.27.1", + "resolved": "https://registry.npmjs.org/@babel/plugin-transform-react-jsx-self/-/plugin-transform-react-jsx-self-7.27.1.tgz", + "integrity": "sha512-6UzkCs+ejGdZ5mFFC/OCUrv028ab2fp1znZmCZjAOBKiBK2jXD1O+BPSfX8X2qjJ75fZBMSnQn3Rq2mrBJK2mw==", + "dev": true, + "license": "MIT", + "dependencies": { + 
"@babel/helper-plugin-utils": "^7.27.1" + }, + "engines": { + "node": ">=6.9.0" + }, + "peerDependencies": { + "@babel/core": "^7.0.0-0" + } + }, + "node_modules/@babel/plugin-transform-react-jsx-source": { + "version": "7.27.1", + "resolved": "https://registry.npmjs.org/@babel/plugin-transform-react-jsx-source/-/plugin-transform-react-jsx-source-7.27.1.tgz", + "integrity": "sha512-zbwoTsBruTeKB9hSq73ha66iFeJHuaFkUbwvqElnygoNbj/jHRsSeokowZFN3CZ64IvEqcmmkVe89OPXc7ldAw==", + "dev": true, + "license": "MIT", + "dependencies": { + "@babel/helper-plugin-utils": "^7.27.1" + }, + "engines": { + "node": ">=6.9.0" + }, + "peerDependencies": { + "@babel/core": "^7.0.0-0" + } + }, + "node_modules/@babel/template": { + "version": "7.28.6", + "resolved": "https://registry.npmjs.org/@babel/template/-/template-7.28.6.tgz", + "integrity": "sha512-YA6Ma2KsCdGb+WC6UpBVFJGXL58MDA6oyONbjyF/+5sBgxY/dwkhLogbMT2GXXyU84/IhRw/2D1Os1B/giz+BQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "@babel/code-frame": "^7.28.6", + "@babel/parser": "^7.28.6", + "@babel/types": "^7.28.6" + }, + "engines": { + "node": ">=6.9.0" + } + }, + "node_modules/@babel/traverse": { + "version": "7.28.6", + "resolved": "https://registry.npmjs.org/@babel/traverse/-/traverse-7.28.6.tgz", + "integrity": "sha512-fgWX62k02qtjqdSNTAGxmKYY/7FSL9WAS1o2Hu5+I5m9T0yxZzr4cnrfXQ/MX0rIifthCSs6FKTlzYbJcPtMNg==", + "dev": true, + "license": "MIT", + "dependencies": { + "@babel/code-frame": "^7.28.6", + "@babel/generator": "^7.28.6", + "@babel/helper-globals": "^7.28.0", + "@babel/parser": "^7.28.6", + "@babel/template": "^7.28.6", + "@babel/types": "^7.28.6", + "debug": "^4.3.1" + }, + "engines": { + "node": ">=6.9.0" + } + }, + "node_modules/@babel/types": { + "version": "7.28.6", + "resolved": "https://registry.npmjs.org/@babel/types/-/types-7.28.6.tgz", + "integrity": "sha512-0ZrskXVEHSWIqZM/sQZ4EV3jZJXRkio/WCxaqKZP1g//CEWEPSfeZFcms4XeKBCHU0ZKnIkdJeU/kF+eRp5lBg==", + "dev": true, + "license": "MIT", + 
"dependencies": { + "@babel/helper-string-parser": "^7.27.1", + "@babel/helper-validator-identifier": "^7.28.5" + }, + "engines": { + "node": ">=6.9.0" + } + }, + "node_modules/@esbuild/aix-ppc64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/aix-ppc64/-/aix-ppc64-0.25.12.tgz", + "integrity": "sha512-Hhmwd6CInZ3dwpuGTF8fJG6yoWmsToE+vYgD4nytZVxcu1ulHpUQRAB1UJ8+N1Am3Mz4+xOByoQoSZf4D+CpkA==", + "cpu": [ + "ppc64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "aix" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/android-arm": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/android-arm/-/android-arm-0.25.12.tgz", + "integrity": "sha512-VJ+sKvNA/GE7Ccacc9Cha7bpS8nyzVv0jdVgwNDaR4gDMC/2TTRc33Ip8qrNYUcpkOHUT5OZ0bUcNNVZQ9RLlg==", + "cpu": [ + "arm" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "android" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/android-arm64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/android-arm64/-/android-arm64-0.25.12.tgz", + "integrity": "sha512-6AAmLG7zwD1Z159jCKPvAxZd4y/VTO0VkprYy+3N2FtJ8+BQWFXU+OxARIwA46c5tdD9SsKGZ/1ocqBS/gAKHg==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "android" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/android-x64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/android-x64/-/android-x64-0.25.12.tgz", + "integrity": "sha512-5jbb+2hhDHx5phYR2By8GTWEzn6I9UqR11Kwf22iKbNpYrsmRB18aX/9ivc5cabcUiAT/wM+YIZ6SG9QO6a8kg==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "android" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/darwin-arm64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/darwin-arm64/-/darwin-arm64-0.25.12.tgz", + 
"integrity": "sha512-N3zl+lxHCifgIlcMUP5016ESkeQjLj/959RxxNYIthIg+CQHInujFuXeWbWMgnTo4cp5XVHqFPmpyu9J65C1Yg==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "darwin" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/darwin-x64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/darwin-x64/-/darwin-x64-0.25.12.tgz", + "integrity": "sha512-HQ9ka4Kx21qHXwtlTUVbKJOAnmG1ipXhdWTmNXiPzPfWKpXqASVcWdnf2bnL73wgjNrFXAa3yYvBSd9pzfEIpA==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "darwin" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/freebsd-arm64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/freebsd-arm64/-/freebsd-arm64-0.25.12.tgz", + "integrity": "sha512-gA0Bx759+7Jve03K1S0vkOu5Lg/85dou3EseOGUes8flVOGxbhDDh/iZaoek11Y8mtyKPGF3vP8XhnkDEAmzeg==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "freebsd" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/freebsd-x64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/freebsd-x64/-/freebsd-x64-0.25.12.tgz", + "integrity": "sha512-TGbO26Yw2xsHzxtbVFGEXBFH0FRAP7gtcPE7P5yP7wGy7cXK2oO7RyOhL5NLiqTlBh47XhmIUXuGciXEqYFfBQ==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "freebsd" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/linux-arm": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/linux-arm/-/linux-arm-0.25.12.tgz", + "integrity": "sha512-lPDGyC1JPDou8kGcywY0YILzWlhhnRjdof3UlcoqYmS9El818LLfJJc3PXXgZHrHCAKs/Z2SeZtDJr5MrkxtOw==", + "cpu": [ + "arm" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/linux-arm64": { + "version": "0.25.12", + 
"resolved": "https://registry.npmjs.org/@esbuild/linux-arm64/-/linux-arm64-0.25.12.tgz", + "integrity": "sha512-8bwX7a8FghIgrupcxb4aUmYDLp8pX06rGh5HqDT7bB+8Rdells6mHvrFHHW2JAOPZUbnjUpKTLg6ECyzvas2AQ==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/linux-ia32": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/linux-ia32/-/linux-ia32-0.25.12.tgz", + "integrity": "sha512-0y9KrdVnbMM2/vG8KfU0byhUN+EFCny9+8g202gYqSSVMonbsCfLjUO+rCci7pM0WBEtz+oK/PIwHkzxkyharA==", + "cpu": [ + "ia32" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/linux-loong64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/linux-loong64/-/linux-loong64-0.25.12.tgz", + "integrity": "sha512-h///Lr5a9rib/v1GGqXVGzjL4TMvVTv+s1DPoxQdz7l/AYv6LDSxdIwzxkrPW438oUXiDtwM10o9PmwS/6Z0Ng==", + "cpu": [ + "loong64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/linux-mips64el": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/linux-mips64el/-/linux-mips64el-0.25.12.tgz", + "integrity": "sha512-iyRrM1Pzy9GFMDLsXn1iHUm18nhKnNMWscjmp4+hpafcZjrr2WbT//d20xaGljXDBYHqRcl8HnxbX6uaA/eGVw==", + "cpu": [ + "mips64el" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/linux-ppc64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/linux-ppc64/-/linux-ppc64-0.25.12.tgz", + "integrity": "sha512-9meM/lRXxMi5PSUqEXRCtVjEZBGwB7P/D4yT8UG/mwIdze2aV4Vo6U5gD3+RsoHXKkHCfSxZKzmDssVlRj1QQA==", + "cpu": [ + "ppc64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + 
"engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/linux-riscv64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/linux-riscv64/-/linux-riscv64-0.25.12.tgz", + "integrity": "sha512-Zr7KR4hgKUpWAwb1f3o5ygT04MzqVrGEGXGLnj15YQDJErYu/BGg+wmFlIDOdJp0PmB0lLvxFIOXZgFRrdjR0w==", + "cpu": [ + "riscv64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/linux-s390x": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/linux-s390x/-/linux-s390x-0.25.12.tgz", + "integrity": "sha512-MsKncOcgTNvdtiISc/jZs/Zf8d0cl/t3gYWX8J9ubBnVOwlk65UIEEvgBORTiljloIWnBzLs4qhzPkJcitIzIg==", + "cpu": [ + "s390x" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/linux-x64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/linux-x64/-/linux-x64-0.25.12.tgz", + "integrity": "sha512-uqZMTLr/zR/ed4jIGnwSLkaHmPjOjJvnm6TVVitAa08SLS9Z0VM8wIRx7gWbJB5/J54YuIMInDquWyYvQLZkgw==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/netbsd-arm64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/netbsd-arm64/-/netbsd-arm64-0.25.12.tgz", + "integrity": "sha512-xXwcTq4GhRM7J9A8Gv5boanHhRa/Q9KLVmcyXHCTaM4wKfIpWkdXiMog/KsnxzJ0A1+nD+zoecuzqPmCRyBGjg==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "netbsd" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/netbsd-x64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/netbsd-x64/-/netbsd-x64-0.25.12.tgz", + "integrity": "sha512-Ld5pTlzPy3YwGec4OuHh1aCVCRvOXdH8DgRjfDy/oumVovmuSzWfnSJg+VtakB9Cm0gxNO9BzWkj6mtO1FMXkQ==", + "cpu": [ + "x64" + 
], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "netbsd" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/openbsd-arm64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/openbsd-arm64/-/openbsd-arm64-0.25.12.tgz", + "integrity": "sha512-fF96T6KsBo/pkQI950FARU9apGNTSlZGsv1jZBAlcLL1MLjLNIWPBkj5NlSz8aAzYKg+eNqknrUJ24QBybeR5A==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "openbsd" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/openbsd-x64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/openbsd-x64/-/openbsd-x64-0.25.12.tgz", + "integrity": "sha512-MZyXUkZHjQxUvzK7rN8DJ3SRmrVrke8ZyRusHlP+kuwqTcfWLyqMOE3sScPPyeIXN/mDJIfGXvcMqCgYKekoQw==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "openbsd" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/openharmony-arm64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/openharmony-arm64/-/openharmony-arm64-0.25.12.tgz", + "integrity": "sha512-rm0YWsqUSRrjncSXGA7Zv78Nbnw4XL6/dzr20cyrQf7ZmRcsovpcRBdhD43Nuk3y7XIoW2OxMVvwuRvk9XdASg==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "openharmony" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/sunos-x64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/sunos-x64/-/sunos-x64-0.25.12.tgz", + "integrity": "sha512-3wGSCDyuTHQUzt0nV7bocDy72r2lI33QL3gkDNGkod22EsYl04sMf0qLb8luNKTOmgF/eDEDP5BFNwoBKH441w==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "sunos" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/win32-arm64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/win32-arm64/-/win32-arm64-0.25.12.tgz", + "integrity": 
"sha512-rMmLrur64A7+DKlnSuwqUdRKyd3UE7oPJZmnljqEptesKM8wx9J8gx5u0+9Pq0fQQW8vqeKebwNXdfOyP+8Bsg==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "win32" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/win32-ia32": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/win32-ia32/-/win32-ia32-0.25.12.tgz", + "integrity": "sha512-HkqnmmBoCbCwxUKKNPBixiWDGCpQGVsrQfJoVGYLPT41XWF8lHuE5N6WhVia2n4o5QK5M4tYr21827fNhi4byQ==", + "cpu": [ + "ia32" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "win32" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/win32-x64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/win32-x64/-/win32-x64-0.25.12.tgz", + "integrity": "sha512-alJC0uCZpTFrSL0CCDjcgleBXPnCrEAhTBILpeAp7M/OFgoqtAetfBzX0xM00MUsVVPpVjlPuMbREqnZCXaTnA==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "win32" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@eslint-community/eslint-utils": { + "version": "4.9.1", + "resolved": "https://registry.npmjs.org/@eslint-community/eslint-utils/-/eslint-utils-4.9.1.tgz", + "integrity": "sha512-phrYmNiYppR7znFEdqgfWHXR6NCkZEK7hwWDHZUjit/2/U0r6XvkDl0SYnoM51Hq7FhCGdLDT6zxCCOY1hexsQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "eslint-visitor-keys": "^3.4.3" + }, + "engines": { + "node": "^12.22.0 || ^14.17.0 || >=16.0.0" + }, + "funding": { + "url": "https://opencollective.com/eslint" + }, + "peerDependencies": { + "eslint": "^6.0.0 || ^7.0.0 || >=8.0.0" + } + }, + "node_modules/@eslint-community/eslint-utils/node_modules/eslint-visitor-keys": { + "version": "3.4.3", + "resolved": "https://registry.npmjs.org/eslint-visitor-keys/-/eslint-visitor-keys-3.4.3.tgz", + "integrity": "sha512-wpc+LXeiyiisxPlEkUzU6svyS1frIO3Mgxj1fdy7Pm8Ygzguax2N3Fa/D/ag1WqbOprdI+uY6wMUl8/a2G+iag==", + "dev": true, + "license": 
"Apache-2.0", + "engines": { + "node": "^12.22.0 || ^14.17.0 || >=16.0.0" + }, + "funding": { + "url": "https://opencollective.com/eslint" + } + }, + "node_modules/@eslint-community/regexpp": { + "version": "4.12.2", + "resolved": "https://registry.npmjs.org/@eslint-community/regexpp/-/regexpp-4.12.2.tgz", + "integrity": "sha512-EriSTlt5OC9/7SXkRSCAhfSxxoSUgBm33OH+IkwbdpgoqsSsUg7y3uh+IICI/Qg4BBWr3U2i39RpmycbxMq4ew==", + "dev": true, + "license": "MIT", + "engines": { + "node": "^12.0.0 || ^14.0.0 || >=16.0.0" + } + }, + "node_modules/@eslint/config-array": { + "version": "0.21.1", + "resolved": "https://registry.npmjs.org/@eslint/config-array/-/config-array-0.21.1.tgz", + "integrity": "sha512-aw1gNayWpdI/jSYVgzN5pL0cfzU02GT3NBpeT/DXbx1/1x7ZKxFPd9bwrzygx/qiwIQiJ1sw/zD8qY/kRvlGHA==", + "dev": true, + "license": "Apache-2.0", + "dependencies": { + "@eslint/object-schema": "^2.1.7", + "debug": "^4.3.1", + "minimatch": "^3.1.2" + }, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + } + }, + "node_modules/@eslint/config-helpers": { + "version": "0.4.2", + "resolved": "https://registry.npmjs.org/@eslint/config-helpers/-/config-helpers-0.4.2.tgz", + "integrity": "sha512-gBrxN88gOIf3R7ja5K9slwNayVcZgK6SOUORm2uBzTeIEfeVaIhOpCtTox3P6R7o2jLFwLFTLnC7kU/RGcYEgw==", + "dev": true, + "license": "Apache-2.0", + "dependencies": { + "@eslint/core": "^0.17.0" + }, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + } + }, + "node_modules/@eslint/core": { + "version": "0.17.0", + "resolved": "https://registry.npmjs.org/@eslint/core/-/core-0.17.0.tgz", + "integrity": "sha512-yL/sLrpmtDaFEiUj1osRP4TI2MDz1AddJL+jZ7KSqvBuliN4xqYY54IfdN8qD8Toa6g1iloph1fxQNkjOxrrpQ==", + "dev": true, + "license": "Apache-2.0", + "dependencies": { + "@types/json-schema": "^7.0.15" + }, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + } + }, + "node_modules/@eslint/eslintrc": { + "version": "3.3.3", + "resolved": 
"https://registry.npmjs.org/@eslint/eslintrc/-/eslintrc-3.3.3.tgz", + "integrity": "sha512-Kr+LPIUVKz2qkx1HAMH8q1q6azbqBAsXJUxBl/ODDuVPX45Z9DfwB8tPjTi6nNZ8BuM3nbJxC5zCAg5elnBUTQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "ajv": "^6.12.4", + "debug": "^4.3.2", + "espree": "^10.0.1", + "globals": "^14.0.0", + "ignore": "^5.2.0", + "import-fresh": "^3.2.1", + "js-yaml": "^4.1.1", + "minimatch": "^3.1.2", + "strip-json-comments": "^3.1.1" + }, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + }, + "funding": { + "url": "https://opencollective.com/eslint" + } + }, + "node_modules/@eslint/js": { + "version": "9.39.2", + "resolved": "https://registry.npmjs.org/@eslint/js/-/js-9.39.2.tgz", + "integrity": "sha512-q1mjIoW1VX4IvSocvM/vbTiveKC4k9eLrajNEuSsmjymSDEbpGddtpfOoN7YGAqBK3NG+uqo8ia4PDTt8buCYA==", + "dev": true, + "license": "MIT", + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + }, + "funding": { + "url": "https://eslint.org/donate" + } + }, + "node_modules/@eslint/object-schema": { + "version": "2.1.7", + "resolved": "https://registry.npmjs.org/@eslint/object-schema/-/object-schema-2.1.7.tgz", + "integrity": "sha512-VtAOaymWVfZcmZbp6E2mympDIHvyjXs/12LqWYjVw6qjrfF+VK+fyG33kChz3nnK+SU5/NeHOqrTEHS8sXO3OA==", + "dev": true, + "license": "Apache-2.0", + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + } + }, + "node_modules/@eslint/plugin-kit": { + "version": "0.4.1", + "resolved": "https://registry.npmjs.org/@eslint/plugin-kit/-/plugin-kit-0.4.1.tgz", + "integrity": "sha512-43/qtrDUokr7LJqoF2c3+RInu/t4zfrpYdoSDfYyhg52rwLV6TnOvdG4fXm7IkSB3wErkcmJS9iEhjVtOSEjjA==", + "dev": true, + "license": "Apache-2.0", + "dependencies": { + "@eslint/core": "^0.17.0", + "levn": "^0.4.1" + }, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + } + }, + "node_modules/@humanfs/core": { + "version": "0.19.1", + "resolved": "https://registry.npmjs.org/@humanfs/core/-/core-0.19.1.tgz", + "integrity": 
"sha512-5DyQ4+1JEUzejeK1JGICcideyfUbGixgS9jNgex5nqkW+cY7WZhxBigmieN5Qnw9ZosSNVC9KQKyb+GUaGyKUA==", + "dev": true, + "license": "Apache-2.0", + "engines": { + "node": ">=18.18.0" + } + }, + "node_modules/@humanfs/node": { + "version": "0.16.7", + "resolved": "https://registry.npmjs.org/@humanfs/node/-/node-0.16.7.tgz", + "integrity": "sha512-/zUx+yOsIrG4Y43Eh2peDeKCxlRt/gET6aHfaKpuq267qXdYDFViVHfMaLyygZOnl0kGWxFIgsBy8QFuTLUXEQ==", + "dev": true, + "license": "Apache-2.0", + "dependencies": { + "@humanfs/core": "^0.19.1", + "@humanwhocodes/retry": "^0.4.0" + }, + "engines": { + "node": ">=18.18.0" + } + }, + "node_modules/@humanwhocodes/module-importer": { + "version": "1.0.1", + "resolved": "https://registry.npmjs.org/@humanwhocodes/module-importer/-/module-importer-1.0.1.tgz", + "integrity": "sha512-bxveV4V8v5Yb4ncFTT3rPSgZBOpCkjfK0y4oVVVJwIuDVBRMDXrPyXRL988i5ap9m9bnyEEjWfm5WkBmtffLfA==", + "dev": true, + "license": "Apache-2.0", + "engines": { + "node": ">=12.22" + }, + "funding": { + "type": "github", + "url": "https://github.com/sponsors/nzakas" + } + }, + "node_modules/@humanwhocodes/retry": { + "version": "0.4.3", + "resolved": "https://registry.npmjs.org/@humanwhocodes/retry/-/retry-0.4.3.tgz", + "integrity": "sha512-bV0Tgo9K4hfPCek+aMAn81RppFKv2ySDQeMoSZuvTASywNTnVJCArCZE2FWqpvIatKu7VMRLWlR1EazvVhDyhQ==", + "dev": true, + "license": "Apache-2.0", + "engines": { + "node": ">=18.18" + }, + "funding": { + "type": "github", + "url": "https://github.com/sponsors/nzakas" + } + }, + "node_modules/@jridgewell/gen-mapping": { + "version": "0.3.13", + "resolved": "https://registry.npmjs.org/@jridgewell/gen-mapping/-/gen-mapping-0.3.13.tgz", + "integrity": "sha512-2kkt/7niJ6MgEPxF0bYdQ6etZaA+fQvDcLKckhy1yIQOzaoKjBBjSj63/aLVjYE3qhRt5dvM+uUyfCg6UKCBbA==", + "dev": true, + "license": "MIT", + "dependencies": { + "@jridgewell/sourcemap-codec": "^1.5.0", + "@jridgewell/trace-mapping": "^0.3.24" + } + }, + "node_modules/@jridgewell/remapping": { + "version": "2.3.5", + 
"resolved": "https://registry.npmjs.org/@jridgewell/remapping/-/remapping-2.3.5.tgz", + "integrity": "sha512-LI9u/+laYG4Ds1TDKSJW2YPrIlcVYOwi2fUC6xB43lueCjgxV4lffOCZCtYFiH6TNOX+tQKXx97T4IKHbhyHEQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "@jridgewell/gen-mapping": "^0.3.5", + "@jridgewell/trace-mapping": "^0.3.24" + } + }, + "node_modules/@jridgewell/resolve-uri": { + "version": "3.1.2", + "resolved": "https://registry.npmjs.org/@jridgewell/resolve-uri/-/resolve-uri-3.1.2.tgz", + "integrity": "sha512-bRISgCIjP20/tbWSPWMEi54QVPRZExkuD9lJL+UIxUKtwVJA8wW1Trb1jMs1RFXo1CBTNZ/5hpC9QvmKWdopKw==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=6.0.0" + } + }, + "node_modules/@jridgewell/sourcemap-codec": { + "version": "1.5.5", + "resolved": "https://registry.npmjs.org/@jridgewell/sourcemap-codec/-/sourcemap-codec-1.5.5.tgz", + "integrity": "sha512-cYQ9310grqxueWbl+WuIUIaiUaDcj7WOq5fVhEljNVgRfOUhY9fy2zTvfoqWsnebh8Sl70VScFbICvJnLKB0Og==", + "dev": true, + "license": "MIT" + }, + "node_modules/@jridgewell/trace-mapping": { + "version": "0.3.31", + "resolved": "https://registry.npmjs.org/@jridgewell/trace-mapping/-/trace-mapping-0.3.31.tgz", + "integrity": "sha512-zzNR+SdQSDJzc8joaeP8QQoCQr8NuYx2dIIytl1QeBEZHJ9uW6hebsrYgbz8hJwUQao3TWCMtmfV8Nu1twOLAw==", + "dev": true, + "license": "MIT", + "dependencies": { + "@jridgewell/resolve-uri": "^3.1.0", + "@jridgewell/sourcemap-codec": "^1.4.14" + } + }, + "node_modules/@rolldown/pluginutils": { + "version": "1.0.0-beta.27", + "resolved": "https://registry.npmjs.org/@rolldown/pluginutils/-/pluginutils-1.0.0-beta.27.tgz", + "integrity": "sha512-+d0F4MKMCbeVUJwG96uQ4SgAznZNSq93I3V+9NHA4OpvqG8mRCpGdKmK8l/dl02h2CCDHwW2FqilnTyDcAnqjA==", + "dev": true, + "license": "MIT" + }, + "node_modules/@rollup/rollup-android-arm-eabi": { + "version": "4.55.1", + "resolved": "https://registry.npmjs.org/@rollup/rollup-android-arm-eabi/-/rollup-android-arm-eabi-4.55.1.tgz", + "integrity": 
"sha512-9R0DM/ykwfGIlNu6+2U09ga0WXeZ9MRC2Ter8jnz8415VbuIykVuc6bhdrbORFZANDmTDvq26mJrEVTl8TdnDg==", + "cpu": [ + "arm" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "android" + ] + }, + "node_modules/@rollup/rollup-android-arm64": { + "version": "4.55.1", + "resolved": "https://registry.npmjs.org/@rollup/rollup-android-arm64/-/rollup-android-arm64-4.55.1.tgz", + "integrity": "sha512-eFZCb1YUqhTysgW3sj/55du5cG57S7UTNtdMjCW7LwVcj3dTTcowCsC8p7uBdzKsZYa8J7IDE8lhMI+HX1vQvg==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "android" + ] + }, + "node_modules/@rollup/rollup-darwin-arm64": { + "version": "4.55.1", + "resolved": "https://registry.npmjs.org/@rollup/rollup-darwin-arm64/-/rollup-darwin-arm64-4.55.1.tgz", + "integrity": "sha512-p3grE2PHcQm2e8PSGZdzIhCKbMCw/xi9XvMPErPhwO17vxtvCN5FEA2mSLgmKlCjHGMQTP6phuQTYWUnKewwGg==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "darwin" + ] + }, + "node_modules/@rollup/rollup-darwin-x64": { + "version": "4.55.1", + "resolved": "https://registry.npmjs.org/@rollup/rollup-darwin-x64/-/rollup-darwin-x64-4.55.1.tgz", + "integrity": "sha512-rDUjG25C9qoTm+e02Esi+aqTKSBYwVTaoS1wxcN47/Luqef57Vgp96xNANwt5npq9GDxsH7kXxNkJVEsWEOEaQ==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "darwin" + ] + }, + "node_modules/@rollup/rollup-freebsd-arm64": { + "version": "4.55.1", + "resolved": "https://registry.npmjs.org/@rollup/rollup-freebsd-arm64/-/rollup-freebsd-arm64-4.55.1.tgz", + "integrity": "sha512-+JiU7Jbp5cdxekIgdte0jfcu5oqw4GCKr6i3PJTlXTCU5H5Fvtkpbs4XJHRmWNXF+hKmn4v7ogI5OQPaupJgOg==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "freebsd" + ] + }, + "node_modules/@rollup/rollup-freebsd-x64": { + "version": "4.55.1", + "resolved": "https://registry.npmjs.org/@rollup/rollup-freebsd-x64/-/rollup-freebsd-x64-4.55.1.tgz", + 
"integrity": "sha512-V5xC1tOVWtLLmr3YUk2f6EJK4qksksOYiz/TCsFHu/R+woubcLWdC9nZQmwjOAbmExBIVKsm1/wKmEy4z4u4Bw==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "freebsd" + ] + }, + "node_modules/@rollup/rollup-linux-arm-gnueabihf": { + "version": "4.55.1", + "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-arm-gnueabihf/-/rollup-linux-arm-gnueabihf-4.55.1.tgz", + "integrity": "sha512-Rn3n+FUk2J5VWx+ywrG/HGPTD9jXNbicRtTM11e/uorplArnXZYsVifnPPqNNP5BsO3roI4n8332ukpY/zN7rQ==", + "cpu": [ + "arm" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ] + }, + "node_modules/@rollup/rollup-linux-arm-musleabihf": { + "version": "4.55.1", + "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-arm-musleabihf/-/rollup-linux-arm-musleabihf-4.55.1.tgz", + "integrity": "sha512-grPNWydeKtc1aEdrJDWk4opD7nFtQbMmV7769hiAaYyUKCT1faPRm2av8CX1YJsZ4TLAZcg9gTR1KvEzoLjXkg==", + "cpu": [ + "arm" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ] + }, + "node_modules/@rollup/rollup-linux-arm64-gnu": { + "version": "4.55.1", + "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-arm64-gnu/-/rollup-linux-arm64-gnu-4.55.1.tgz", + "integrity": "sha512-a59mwd1k6x8tXKcUxSyISiquLwB5pX+fJW9TkWU46lCqD/GRDe9uDN31jrMmVP3feI3mhAdvcCClhV8V5MhJFQ==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ] + }, + "node_modules/@rollup/rollup-linux-arm64-musl": { + "version": "4.55.1", + "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-arm64-musl/-/rollup-linux-arm64-musl-4.55.1.tgz", + "integrity": "sha512-puS1MEgWX5GsHSoiAsF0TYrpomdvkaXm0CofIMG5uVkP6IBV+ZO9xhC5YEN49nsgYo1DuuMquF9+7EDBVYu4uA==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ] + }, + "node_modules/@rollup/rollup-linux-loong64-gnu": { + "version": "4.55.1", + "resolved": 
"https://registry.npmjs.org/@rollup/rollup-linux-loong64-gnu/-/rollup-linux-loong64-gnu-4.55.1.tgz", + "integrity": "sha512-r3Wv40in+lTsULSb6nnoudVbARdOwb2u5fpeoOAZjFLznp6tDU8kd+GTHmJoqZ9lt6/Sys33KdIHUaQihFcu7g==", + "cpu": [ + "loong64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ] + }, + "node_modules/@rollup/rollup-linux-loong64-musl": { + "version": "4.55.1", + "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-loong64-musl/-/rollup-linux-loong64-musl-4.55.1.tgz", + "integrity": "sha512-MR8c0+UxAlB22Fq4R+aQSPBayvYa3+9DrwG/i1TKQXFYEaoW3B5b/rkSRIypcZDdWjWnpcvxbNaAJDcSbJU3Lw==", + "cpu": [ + "loong64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ] + }, + "node_modules/@rollup/rollup-linux-ppc64-gnu": { + "version": "4.55.1", + "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-ppc64-gnu/-/rollup-linux-ppc64-gnu-4.55.1.tgz", + "integrity": "sha512-3KhoECe1BRlSYpMTeVrD4sh2Pw2xgt4jzNSZIIPLFEsnQn9gAnZagW9+VqDqAHgm1Xc77LzJOo2LdigS5qZ+gw==", + "cpu": [ + "ppc64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ] + }, + "node_modules/@rollup/rollup-linux-ppc64-musl": { + "version": "4.55.1", + "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-ppc64-musl/-/rollup-linux-ppc64-musl-4.55.1.tgz", + "integrity": "sha512-ziR1OuZx0vdYZZ30vueNZTg73alF59DicYrPViG0NEgDVN8/Jl87zkAPu4u6VjZST2llgEUjaiNl9JM6HH1Vdw==", + "cpu": [ + "ppc64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ] + }, + "node_modules/@rollup/rollup-linux-riscv64-gnu": { + "version": "4.55.1", + "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-riscv64-gnu/-/rollup-linux-riscv64-gnu-4.55.1.tgz", + "integrity": "sha512-uW0Y12ih2XJRERZ4jAfKamTyIHVMPQnTZcQjme2HMVDAHY4amf5u414OqNYC+x+LzRdRcnIG1YodLrrtA8xsxw==", + "cpu": [ + "riscv64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ] + }, + 
"node_modules/@rollup/rollup-linux-riscv64-musl": { + "version": "4.55.1", + "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-riscv64-musl/-/rollup-linux-riscv64-musl-4.55.1.tgz", + "integrity": "sha512-u9yZ0jUkOED1BFrqu3BwMQoixvGHGZ+JhJNkNKY/hyoEgOwlqKb62qu+7UjbPSHYjiVy8kKJHvXKv5coH4wDeg==", + "cpu": [ + "riscv64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ] + }, + "node_modules/@rollup/rollup-linux-s390x-gnu": { + "version": "4.55.1", + "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-s390x-gnu/-/rollup-linux-s390x-gnu-4.55.1.tgz", + "integrity": "sha512-/0PenBCmqM4ZUd0190j7J0UsQ/1nsi735iPRakO8iPciE7BQ495Y6msPzaOmvx0/pn+eJVVlZrNrSh4WSYLxNg==", + "cpu": [ + "s390x" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ] + }, + "node_modules/@rollup/rollup-linux-x64-gnu": { + "version": "4.55.1", + "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-x64-gnu/-/rollup-linux-x64-gnu-4.55.1.tgz", + "integrity": "sha512-a8G4wiQxQG2BAvo+gU6XrReRRqj+pLS2NGXKm8io19goR+K8lw269eTrPkSdDTALwMmJp4th2Uh0D8J9bEV1vg==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ] + }, + "node_modules/@rollup/rollup-linux-x64-musl": { + "version": "4.55.1", + "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-x64-musl/-/rollup-linux-x64-musl-4.55.1.tgz", + "integrity": "sha512-bD+zjpFrMpP/hqkfEcnjXWHMw5BIghGisOKPj+2NaNDuVT+8Ds4mPf3XcPHuat1tz89WRL+1wbcxKY3WSbiT7w==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ] + }, + "node_modules/@rollup/rollup-openbsd-x64": { + "version": "4.55.1", + "resolved": "https://registry.npmjs.org/@rollup/rollup-openbsd-x64/-/rollup-openbsd-x64-4.55.1.tgz", + "integrity": "sha512-eLXw0dOiqE4QmvikfQ6yjgkg/xDM+MdU9YJuP4ySTibXU0oAvnEWXt7UDJmD4UkYialMfOGFPJnIHSe/kdzPxg==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + 
"os": [ + "openbsd" + ] + }, + "node_modules/@rollup/rollup-openharmony-arm64": { + "version": "4.55.1", + "resolved": "https://registry.npmjs.org/@rollup/rollup-openharmony-arm64/-/rollup-openharmony-arm64-4.55.1.tgz", + "integrity": "sha512-xzm44KgEP11te3S2HCSyYf5zIzWmx3n8HDCc7EE59+lTcswEWNpvMLfd9uJvVX8LCg9QWG67Xt75AuHn4vgsXw==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "openharmony" + ] + }, + "node_modules/@rollup/rollup-win32-arm64-msvc": { + "version": "4.55.1", + "resolved": "https://registry.npmjs.org/@rollup/rollup-win32-arm64-msvc/-/rollup-win32-arm64-msvc-4.55.1.tgz", + "integrity": "sha512-yR6Bl3tMC/gBok5cz/Qi0xYnVbIxGx5Fcf/ca0eB6/6JwOY+SRUcJfI0OpeTpPls7f194as62thCt/2BjxYN8g==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "win32" + ] + }, + "node_modules/@rollup/rollup-win32-ia32-msvc": { + "version": "4.55.1", + "resolved": "https://registry.npmjs.org/@rollup/rollup-win32-ia32-msvc/-/rollup-win32-ia32-msvc-4.55.1.tgz", + "integrity": "sha512-3fZBidchE0eY0oFZBnekYCfg+5wAB0mbpCBuofh5mZuzIU/4jIVkbESmd2dOsFNS78b53CYv3OAtwqkZZmU5nA==", + "cpu": [ + "ia32" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "win32" + ] + }, + "node_modules/@rollup/rollup-win32-x64-gnu": { + "version": "4.55.1", + "resolved": "https://registry.npmjs.org/@rollup/rollup-win32-x64-gnu/-/rollup-win32-x64-gnu-4.55.1.tgz", + "integrity": "sha512-xGGY5pXj69IxKb4yv/POoocPy/qmEGhimy/FoTpTSVju3FYXUQQMFCaZZXJVidsmGxRioZAwpThl/4zX41gRKg==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "win32" + ] + }, + "node_modules/@rollup/rollup-win32-x64-msvc": { + "version": "4.55.1", + "resolved": "https://registry.npmjs.org/@rollup/rollup-win32-x64-msvc/-/rollup-win32-x64-msvc-4.55.1.tgz", + "integrity": "sha512-SPEpaL6DX4rmcXtnhdrQYgzQ5W2uW3SCJch88lB2zImhJRhIIK44fkUrgIV/Q8yUNfw5oyZ5vkeQsZLhCb06lw==", + "cpu": [ + "x64" + ], + "dev": 
true, + "license": "MIT", + "optional": true, + "os": [ + "win32" + ] + }, + "node_modules/@tailwindcss/node": { + "version": "4.1.18", + "resolved": "https://registry.npmjs.org/@tailwindcss/node/-/node-4.1.18.tgz", + "integrity": "sha512-DoR7U1P7iYhw16qJ49fgXUlry1t4CpXeErJHnQ44JgTSKMaZUdf17cfn5mHchfJ4KRBZRFA/Coo+MUF5+gOaCQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "@jridgewell/remapping": "^2.3.4", + "enhanced-resolve": "^5.18.3", + "jiti": "^2.6.1", + "lightningcss": "1.30.2", + "magic-string": "^0.30.21", + "source-map-js": "^1.2.1", + "tailwindcss": "4.1.18" + } + }, + "node_modules/@tailwindcss/oxide": { + "version": "4.1.18", + "resolved": "https://registry.npmjs.org/@tailwindcss/oxide/-/oxide-4.1.18.tgz", + "integrity": "sha512-EgCR5tTS5bUSKQgzeMClT6iCY3ToqE1y+ZB0AKldj809QXk1Y+3jB0upOYZrn9aGIzPtUsP7sX4QQ4XtjBB95A==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">= 10" + }, + "optionalDependencies": { + "@tailwindcss/oxide-android-arm64": "4.1.18", + "@tailwindcss/oxide-darwin-arm64": "4.1.18", + "@tailwindcss/oxide-darwin-x64": "4.1.18", + "@tailwindcss/oxide-freebsd-x64": "4.1.18", + "@tailwindcss/oxide-linux-arm-gnueabihf": "4.1.18", + "@tailwindcss/oxide-linux-arm64-gnu": "4.1.18", + "@tailwindcss/oxide-linux-arm64-musl": "4.1.18", + "@tailwindcss/oxide-linux-x64-gnu": "4.1.18", + "@tailwindcss/oxide-linux-x64-musl": "4.1.18", + "@tailwindcss/oxide-wasm32-wasi": "4.1.18", + "@tailwindcss/oxide-win32-arm64-msvc": "4.1.18", + "@tailwindcss/oxide-win32-x64-msvc": "4.1.18" + } + }, + "node_modules/@tailwindcss/oxide-android-arm64": { + "version": "4.1.18", + "resolved": "https://registry.npmjs.org/@tailwindcss/oxide-android-arm64/-/oxide-android-arm64-4.1.18.tgz", + "integrity": "sha512-dJHz7+Ugr9U/diKJA0W6N/6/cjI+ZTAoxPf9Iz9BFRF2GzEX8IvXxFIi/dZBloVJX/MZGvRuFA9rqwdiIEZQ0Q==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "android" + ], + "engines": { + "node": ">= 10" + } + }, + 
"node_modules/@tailwindcss/oxide-darwin-arm64": { + "version": "4.1.18", + "resolved": "https://registry.npmjs.org/@tailwindcss/oxide-darwin-arm64/-/oxide-darwin-arm64-4.1.18.tgz", + "integrity": "sha512-Gc2q4Qhs660bhjyBSKgq6BYvwDz4G+BuyJ5H1xfhmDR3D8HnHCmT/BSkvSL0vQLy/nkMLY20PQ2OoYMO15Jd0A==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "darwin" + ], + "engines": { + "node": ">= 10" + } + }, + "node_modules/@tailwindcss/oxide-darwin-x64": { + "version": "4.1.18", + "resolved": "https://registry.npmjs.org/@tailwindcss/oxide-darwin-x64/-/oxide-darwin-x64-4.1.18.tgz", + "integrity": "sha512-FL5oxr2xQsFrc3X9o1fjHKBYBMD1QZNyc1Xzw/h5Qu4XnEBi3dZn96HcHm41c/euGV+GRiXFfh2hUCyKi/e+yw==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "darwin" + ], + "engines": { + "node": ">= 10" + } + }, + "node_modules/@tailwindcss/oxide-freebsd-x64": { + "version": "4.1.18", + "resolved": "https://registry.npmjs.org/@tailwindcss/oxide-freebsd-x64/-/oxide-freebsd-x64-4.1.18.tgz", + "integrity": "sha512-Fj+RHgu5bDodmV1dM9yAxlfJwkkWvLiRjbhuO2LEtwtlYlBgiAT4x/j5wQr1tC3SANAgD+0YcmWVrj8R9trVMA==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "freebsd" + ], + "engines": { + "node": ">= 10" + } + }, + "node_modules/@tailwindcss/oxide-linux-arm-gnueabihf": { + "version": "4.1.18", + "resolved": "https://registry.npmjs.org/@tailwindcss/oxide-linux-arm-gnueabihf/-/oxide-linux-arm-gnueabihf-4.1.18.tgz", + "integrity": "sha512-Fp+Wzk/Ws4dZn+LV2Nqx3IilnhH51YZoRaYHQsVq3RQvEl+71VGKFpkfHrLM/Li+kt5c0DJe/bHXK1eHgDmdiA==", + "cpu": [ + "arm" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">= 10" + } + }, + "node_modules/@tailwindcss/oxide-linux-arm64-gnu": { + "version": "4.1.18", + "resolved": "https://registry.npmjs.org/@tailwindcss/oxide-linux-arm64-gnu/-/oxide-linux-arm64-gnu-4.1.18.tgz", + "integrity": 
"sha512-S0n3jboLysNbh55Vrt7pk9wgpyTTPD0fdQeh7wQfMqLPM/Hrxi+dVsLsPrycQjGKEQk85Kgbx+6+QnYNiHalnw==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">= 10" + } + }, + "node_modules/@tailwindcss/oxide-linux-arm64-musl": { + "version": "4.1.18", + "resolved": "https://registry.npmjs.org/@tailwindcss/oxide-linux-arm64-musl/-/oxide-linux-arm64-musl-4.1.18.tgz", + "integrity": "sha512-1px92582HkPQlaaCkdRcio71p8bc8i/ap5807tPRDK/uw953cauQBT8c5tVGkOwrHMfc2Yh6UuxaH4vtTjGvHg==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">= 10" + } + }, + "node_modules/@tailwindcss/oxide-linux-x64-gnu": { + "version": "4.1.18", + "resolved": "https://registry.npmjs.org/@tailwindcss/oxide-linux-x64-gnu/-/oxide-linux-x64-gnu-4.1.18.tgz", + "integrity": "sha512-v3gyT0ivkfBLoZGF9LyHmts0Isc8jHZyVcbzio6Wpzifg/+5ZJpDiRiUhDLkcr7f/r38SWNe7ucxmGW3j3Kb/g==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">= 10" + } + }, + "node_modules/@tailwindcss/oxide-linux-x64-musl": { + "version": "4.1.18", + "resolved": "https://registry.npmjs.org/@tailwindcss/oxide-linux-x64-musl/-/oxide-linux-x64-musl-4.1.18.tgz", + "integrity": "sha512-bhJ2y2OQNlcRwwgOAGMY0xTFStt4/wyU6pvI6LSuZpRgKQwxTec0/3Scu91O8ir7qCR3AuepQKLU/kX99FouqQ==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">= 10" + } + }, + "node_modules/@tailwindcss/oxide-wasm32-wasi": { + "version": "4.1.18", + "resolved": "https://registry.npmjs.org/@tailwindcss/oxide-wasm32-wasi/-/oxide-wasm32-wasi-4.1.18.tgz", + "integrity": "sha512-LffYTvPjODiP6PT16oNeUQJzNVyJl1cjIebq/rWWBF+3eDst5JGEFSc5cWxyRCJ0Mxl+KyIkqRxk1XPEs9x8TA==", + "bundleDependencies": [ + "@napi-rs/wasm-runtime", + "@emnapi/core", + "@emnapi/runtime", + 
"@tybys/wasm-util", + "@emnapi/wasi-threads", + "tslib" + ], + "cpu": [ + "wasm32" + ], + "dev": true, + "license": "MIT", + "optional": true, + "dependencies": { + "@emnapi/core": "^1.7.1", + "@emnapi/runtime": "^1.7.1", + "@emnapi/wasi-threads": "^1.1.0", + "@napi-rs/wasm-runtime": "^1.1.0", + "@tybys/wasm-util": "^0.10.1", + "tslib": "^2.4.0" + }, + "engines": { + "node": ">=14.0.0" + } + }, + "node_modules/@tailwindcss/oxide-win32-arm64-msvc": { + "version": "4.1.18", + "resolved": "https://registry.npmjs.org/@tailwindcss/oxide-win32-arm64-msvc/-/oxide-win32-arm64-msvc-4.1.18.tgz", + "integrity": "sha512-HjSA7mr9HmC8fu6bdsZvZ+dhjyGCLdotjVOgLA2vEqxEBZaQo9YTX4kwgEvPCpRh8o4uWc4J/wEoFzhEmjvPbA==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "win32" + ], + "engines": { + "node": ">= 10" + } + }, + "node_modules/@tailwindcss/oxide-win32-x64-msvc": { + "version": "4.1.18", + "resolved": "https://registry.npmjs.org/@tailwindcss/oxide-win32-x64-msvc/-/oxide-win32-x64-msvc-4.1.18.tgz", + "integrity": "sha512-bJWbyYpUlqamC8dpR7pfjA0I7vdF6t5VpUGMWRkXVE3AXgIZjYUYAK7II1GNaxR8J1SSrSrppRar8G++JekE3Q==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "win32" + ], + "engines": { + "node": ">= 10" + } + }, + "node_modules/@tailwindcss/vite": { + "version": "4.1.18", + "resolved": "https://registry.npmjs.org/@tailwindcss/vite/-/vite-4.1.18.tgz", + "integrity": "sha512-jVA+/UpKL1vRLg6Hkao5jldawNmRo7mQYrZtNHMIVpLfLhDml5nMRUo/8MwoX2vNXvnaXNNMedrMfMugAVX1nA==", + "dev": true, + "license": "MIT", + "dependencies": { + "@tailwindcss/node": "4.1.18", + "@tailwindcss/oxide": "4.1.18", + "tailwindcss": "4.1.18" + }, + "peerDependencies": { + "vite": "^5.2.0 || ^6 || ^7" + } + }, + "node_modules/@types/babel__core": { + "version": "7.20.5", + "resolved": "https://registry.npmjs.org/@types/babel__core/-/babel__core-7.20.5.tgz", + "integrity": 
"sha512-qoQprZvz5wQFJwMDqeseRXWv3rqMvhgpbXFfVyWhbx9X47POIA6i/+dXefEmZKoAgOaTdaIgNSMqMIU61yRyzA==", + "dev": true, + "license": "MIT", + "dependencies": { + "@babel/parser": "^7.20.7", + "@babel/types": "^7.20.7", + "@types/babel__generator": "*", + "@types/babel__template": "*", + "@types/babel__traverse": "*" + } + }, + "node_modules/@types/babel__generator": { + "version": "7.27.0", + "resolved": "https://registry.npmjs.org/@types/babel__generator/-/babel__generator-7.27.0.tgz", + "integrity": "sha512-ufFd2Xi92OAVPYsy+P4n7/U7e68fex0+Ee8gSG9KX7eo084CWiQ4sdxktvdl0bOPupXtVJPY19zk6EwWqUQ8lg==", + "dev": true, + "license": "MIT", + "dependencies": { + "@babel/types": "^7.0.0" + } + }, + "node_modules/@types/babel__template": { + "version": "7.4.4", + "resolved": "https://registry.npmjs.org/@types/babel__template/-/babel__template-7.4.4.tgz", + "integrity": "sha512-h/NUaSyG5EyxBIp8YRxo4RMe2/qQgvyowRwVMzhYhBCONbW8PUsg4lkFMrhgZhUe5z3L3MiLDuvyJ/CaPa2A8A==", + "dev": true, + "license": "MIT", + "dependencies": { + "@babel/parser": "^7.1.0", + "@babel/types": "^7.0.0" + } + }, + "node_modules/@types/babel__traverse": { + "version": "7.28.0", + "resolved": "https://registry.npmjs.org/@types/babel__traverse/-/babel__traverse-7.28.0.tgz", + "integrity": "sha512-8PvcXf70gTDZBgt9ptxJ8elBeBjcLOAcOtoO/mPJjtji1+CdGbHgm77om1GrsPxsiE+uXIpNSK64UYaIwQXd4Q==", + "dev": true, + "license": "MIT", + "dependencies": { + "@babel/types": "^7.28.2" + } + }, + "node_modules/@types/debug": { + "version": "4.1.12", + "resolved": "https://registry.npmjs.org/@types/debug/-/debug-4.1.12.tgz", + "integrity": "sha512-vIChWdVG3LG1SMxEvI/AK+FWJthlrqlTu7fbrlywTkkaONwk/UAGaULXRlf8vkzFBLVm0zkMdCquhL5aOjhXPQ==", + "license": "MIT", + "dependencies": { + "@types/ms": "*" + } + }, + "node_modules/@types/estree": { + "version": "1.0.8", + "resolved": "https://registry.npmjs.org/@types/estree/-/estree-1.0.8.tgz", + "integrity": 
"sha512-dWHzHa2WqEXI/O1E9OjrocMTKJl2mSrEolh1Iomrv6U+JuNwaHXsXx9bLu5gG7BUWFIN0skIQJQ/L1rIex4X6w==", + "license": "MIT" + }, + "node_modules/@types/estree-jsx": { + "version": "1.0.5", + "resolved": "https://registry.npmjs.org/@types/estree-jsx/-/estree-jsx-1.0.5.tgz", + "integrity": "sha512-52CcUVNFyfb1A2ALocQw/Dd1BQFNmSdkuC3BkZ6iqhdMfQz7JWOFRuJFloOzjk+6WijU56m9oKXFAXc7o3Towg==", + "license": "MIT", + "dependencies": { + "@types/estree": "*" + } + }, + "node_modules/@types/hast": { + "version": "3.0.4", + "resolved": "https://registry.npmjs.org/@types/hast/-/hast-3.0.4.tgz", + "integrity": "sha512-WPs+bbQw5aCj+x6laNGWLH3wviHtoCv/P3+otBhbOhJgG8qtpdAMlTCxLtsTWA7LH1Oh/bFCHsBn0TPS5m30EQ==", + "license": "MIT", + "dependencies": { + "@types/unist": "*" + } + }, + "node_modules/@types/json-schema": { + "version": "7.0.15", + "resolved": "https://registry.npmjs.org/@types/json-schema/-/json-schema-7.0.15.tgz", + "integrity": "sha512-5+fP8P8MFNC+AyZCDxrB2pkZFPGzqQWUzpSeuuVLvm8VMcorNYavBqoFcxK8bQz4Qsbn4oUEEem4wDLfcysGHA==", + "dev": true, + "license": "MIT" + }, + "node_modules/@types/mdast": { + "version": "4.0.4", + "resolved": "https://registry.npmjs.org/@types/mdast/-/mdast-4.0.4.tgz", + "integrity": "sha512-kGaNbPh1k7AFzgpud/gMdvIm5xuECykRR+JnWKQno9TAXVa6WIVCGTPvYGekIDL4uwCZQSYbUxNBSb1aUo79oA==", + "license": "MIT", + "dependencies": { + "@types/unist": "*" + } + }, + "node_modules/@types/ms": { + "version": "2.1.0", + "resolved": "https://registry.npmjs.org/@types/ms/-/ms-2.1.0.tgz", + "integrity": "sha512-GsCCIZDE/p3i96vtEqx+7dBUGXrc7zeSK3wwPHIaRThS+9OhWIXRqzs4d6k1SVU8g91DrNRWxWUGhp5KXQb2VA==", + "license": "MIT" + }, + "node_modules/@types/prop-types": { + "version": "15.7.15", + "resolved": "https://registry.npmjs.org/@types/prop-types/-/prop-types-15.7.15.tgz", + "integrity": "sha512-F6bEyamV9jKGAFBEmlQnesRPGOQqS2+Uwi0Em15xenOxHaf2hv6L8YCVn3rPdPJOiJfPiCnLIRyvwVaqMY3MIw==", + "license": "MIT" + }, + "node_modules/@types/react": { + "version": "18.3.27", + 
"resolved": "https://registry.npmjs.org/@types/react/-/react-18.3.27.tgz", + "integrity": "sha512-cisd7gxkzjBKU2GgdYrTdtQx1SORymWyaAFhaxQPK9bYO9ot3Y5OikQRvY0VYQtvwjeQnizCINJAenh/V7MK2w==", + "license": "MIT", + "dependencies": { + "@types/prop-types": "*", + "csstype": "^3.2.2" + } + }, + "node_modules/@types/react-dom": { + "version": "18.3.7", + "resolved": "https://registry.npmjs.org/@types/react-dom/-/react-dom-18.3.7.tgz", + "integrity": "sha512-MEe3UeoENYVFXzoXEWsvcpg6ZvlrFNlOQ7EOsvhI3CfAXwzPfO8Qwuxd40nepsYKqyyVQnTdEfv68q91yLcKrQ==", + "dev": true, + "license": "MIT", + "peerDependencies": { + "@types/react": "^18.0.0" + } + }, + "node_modules/@types/unist": { + "version": "3.0.3", + "resolved": "https://registry.npmjs.org/@types/unist/-/unist-3.0.3.tgz", + "integrity": "sha512-ko/gIFJRv177XgZsZcBwnqJN5x/Gien8qNOn0D5bQU/zAzVf9Zt3BlcUiLqhV9y4ARk0GbT3tnUiPNgnTXzc/Q==", + "license": "MIT" + }, + "node_modules/@ungap/structured-clone": { + "version": "1.3.0", + "resolved": "https://registry.npmjs.org/@ungap/structured-clone/-/structured-clone-1.3.0.tgz", + "integrity": "sha512-WmoN8qaIAo7WTYWbAZuG8PYEhn5fkz7dZrqTBZ7dtt//lL2Gwms1IcnQ5yHqjDfX8Ft5j4YzDM23f87zBfDe9g==", + "license": "ISC" + }, + "node_modules/@vitejs/plugin-react": { + "version": "4.7.0", + "resolved": "https://registry.npmjs.org/@vitejs/plugin-react/-/plugin-react-4.7.0.tgz", + "integrity": "sha512-gUu9hwfWvvEDBBmgtAowQCojwZmJ5mcLn3aufeCsitijs3+f2NsrPtlAWIR6OPiqljl96GVCUbLe0HyqIpVaoA==", + "dev": true, + "license": "MIT", + "dependencies": { + "@babel/core": "^7.28.0", + "@babel/plugin-transform-react-jsx-self": "^7.27.1", + "@babel/plugin-transform-react-jsx-source": "^7.27.1", + "@rolldown/pluginutils": "1.0.0-beta.27", + "@types/babel__core": "^7.20.5", + "react-refresh": "^0.17.0" + }, + "engines": { + "node": "^14.18.0 || >=16.0.0" + }, + "peerDependencies": { + "vite": "^4.2.0 || ^5.0.0 || ^6.0.0 || ^7.0.0" + } + }, + "node_modules/acorn": { + "version": "8.15.0", + "resolved": 
"https://registry.npmjs.org/acorn/-/acorn-8.15.0.tgz", + "integrity": "sha512-NZyJarBfL7nWwIq+FDL6Zp/yHEhePMNnnJ0y3qfieCrmNvYct8uvtiV41UvlSe6apAfk0fY1FbWx+NwfmpvtTg==", + "dev": true, + "license": "MIT", + "bin": { + "acorn": "bin/acorn" + }, + "engines": { + "node": ">=0.4.0" + } + }, + "node_modules/acorn-jsx": { + "version": "5.3.2", + "resolved": "https://registry.npmjs.org/acorn-jsx/-/acorn-jsx-5.3.2.tgz", + "integrity": "sha512-rq9s+JNhf0IChjtDXxllJ7g41oZk5SlXtp0LHwyA5cejwn7vKmKp4pPri6YEePv2PU65sAsegbXtIinmDFDXgQ==", + "dev": true, + "license": "MIT", + "peerDependencies": { + "acorn": "^6.0.0 || ^7.0.0 || ^8.0.0" + } + }, + "node_modules/ajv": { + "version": "6.12.6", + "resolved": "https://registry.npmjs.org/ajv/-/ajv-6.12.6.tgz", + "integrity": "sha512-j3fVLgvTo527anyYyJOGTYJbG+vnnQYvE0m5mmkc1TK+nxAppkCLMIL0aZ4dblVCNoGShhm+kzE4ZUykBoMg4g==", + "dev": true, + "license": "MIT", + "dependencies": { + "fast-deep-equal": "^3.1.1", + "fast-json-stable-stringify": "^2.0.0", + "json-schema-traverse": "^0.4.1", + "uri-js": "^4.2.2" + }, + "funding": { + "type": "github", + "url": "https://github.com/sponsors/epoberezkin" + } + }, + "node_modules/ansi-styles": { + "version": "4.3.0", + "resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-4.3.0.tgz", + "integrity": "sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg==", + "dev": true, + "license": "MIT", + "dependencies": { + "color-convert": "^2.0.1" + }, + "engines": { + "node": ">=8" + }, + "funding": { + "url": "https://github.com/chalk/ansi-styles?sponsor=1" + } + }, + "node_modules/argparse": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/argparse/-/argparse-2.0.1.tgz", + "integrity": "sha512-8+9WqebbFzpX9OR+Wa6O29asIogeRMzcGtAINdpMHHyAg10f05aSFVBbcEqGf/PXw1EjAZ+q2/bEBg3DvurK3Q==", + "dev": true, + "license": "Python-2.0" + }, + "node_modules/array-buffer-byte-length": { + "version": "1.0.2", + "resolved": 
"https://registry.npmjs.org/array-buffer-byte-length/-/array-buffer-byte-length-1.0.2.tgz", + "integrity": "sha512-LHE+8BuR7RYGDKvnrmcuSq3tDcKv9OFEXQt/HpbZhY7V6h0zlUXutnAD82GiFx9rdieCMjkvtcsPqBwgUl1Iiw==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bound": "^1.0.3", + "is-array-buffer": "^3.0.5" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/array-includes": { + "version": "3.1.9", + "resolved": "https://registry.npmjs.org/array-includes/-/array-includes-3.1.9.tgz", + "integrity": "sha512-FmeCCAenzH0KH381SPT5FZmiA/TmpndpcaShhfgEN9eCVjnFBqq3l1xrI42y8+PPLI6hypzou4GXw00WHmPBLQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bind": "^1.0.8", + "call-bound": "^1.0.4", + "define-properties": "^1.2.1", + "es-abstract": "^1.24.0", + "es-object-atoms": "^1.1.1", + "get-intrinsic": "^1.3.0", + "is-string": "^1.1.1", + "math-intrinsics": "^1.1.0" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/array.prototype.findlast": { + "version": "1.2.5", + "resolved": "https://registry.npmjs.org/array.prototype.findlast/-/array.prototype.findlast-1.2.5.tgz", + "integrity": "sha512-CVvd6FHg1Z3POpBLxO6E6zr+rSKEQ9L6rZHAaY7lLfhKsWYUBBOuMs0e9o24oopj6H+geRCX0YJ+TJLBK2eHyQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bind": "^1.0.7", + "define-properties": "^1.2.1", + "es-abstract": "^1.23.2", + "es-errors": "^1.3.0", + "es-object-atoms": "^1.0.0", + "es-shim-unscopables": "^1.0.2" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/array.prototype.flat": { + "version": "1.3.3", + "resolved": "https://registry.npmjs.org/array.prototype.flat/-/array.prototype.flat-1.3.3.tgz", + "integrity": "sha512-rwG/ja1neyLqCuGZ5YYrznA62D4mZXg0i1cIskIUKSiqF3Cje9/wXAls9B9s1Wa2fomMsIv8czB8jZcPmxCXFg==", + "dev": 
true, + "license": "MIT", + "dependencies": { + "call-bind": "^1.0.8", + "define-properties": "^1.2.1", + "es-abstract": "^1.23.5", + "es-shim-unscopables": "^1.0.2" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/array.prototype.flatmap": { + "version": "1.3.3", + "resolved": "https://registry.npmjs.org/array.prototype.flatmap/-/array.prototype.flatmap-1.3.3.tgz", + "integrity": "sha512-Y7Wt51eKJSyi80hFrJCePGGNo5ktJCslFuboqJsbf57CCPcm5zztluPlc4/aD8sWsKvlwatezpV4U1efk8kpjg==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bind": "^1.0.8", + "define-properties": "^1.2.1", + "es-abstract": "^1.23.5", + "es-shim-unscopables": "^1.0.2" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/array.prototype.tosorted": { + "version": "1.1.4", + "resolved": "https://registry.npmjs.org/array.prototype.tosorted/-/array.prototype.tosorted-1.1.4.tgz", + "integrity": "sha512-p6Fx8B7b7ZhL/gmUsAy0D15WhvDccw3mnGNbZpi3pmeJdxtWsj2jEaI4Y6oo3XiHfzuSgPwKc04MYt6KgvC/wA==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bind": "^1.0.7", + "define-properties": "^1.2.1", + "es-abstract": "^1.23.3", + "es-errors": "^1.3.0", + "es-shim-unscopables": "^1.0.2" + }, + "engines": { + "node": ">= 0.4" + } + }, + "node_modules/arraybuffer.prototype.slice": { + "version": "1.0.4", + "resolved": "https://registry.npmjs.org/arraybuffer.prototype.slice/-/arraybuffer.prototype.slice-1.0.4.tgz", + "integrity": "sha512-BNoCY6SXXPQ7gF2opIP4GBE+Xw7U+pHMYKuzjgCN3GwiaIR09UUeKfheyIry77QtrCBlC0KK0q5/TER/tYh3PQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "array-buffer-byte-length": "^1.0.1", + "call-bind": "^1.0.8", + "define-properties": "^1.2.1", + "es-abstract": "^1.23.5", + "es-errors": "^1.3.0", + "get-intrinsic": "^1.2.6", + "is-array-buffer": "^3.0.4" + }, + "engines": { + "node": ">= 0.4" + }, + 
"funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/async-function": { + "version": "1.0.0", + "resolved": "https://registry.npmjs.org/async-function/-/async-function-1.0.0.tgz", + "integrity": "sha512-hsU18Ae8CDTR6Kgu9DYf0EbCr/a5iGL0rytQDobUcdpYOKokk8LEjVphnXkDkgpi0wYVsqrXuP0bZxJaTqdgoA==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">= 0.4" + } + }, + "node_modules/autoprefixer": { + "version": "10.4.23", + "resolved": "https://registry.npmjs.org/autoprefixer/-/autoprefixer-10.4.23.tgz", + "integrity": "sha512-YYTXSFulfwytnjAPlw8QHncHJmlvFKtczb8InXaAx9Q0LbfDnfEYDE55omerIJKihhmU61Ft+cAOSzQVaBUmeA==", + "dev": true, + "funding": [ + { + "type": "opencollective", + "url": "https://opencollective.com/postcss/" + }, + { + "type": "tidelift", + "url": "https://tidelift.com/funding/github/npm/autoprefixer" + }, + { + "type": "github", + "url": "https://github.com/sponsors/ai" + } + ], + "license": "MIT", + "dependencies": { + "browserslist": "^4.28.1", + "caniuse-lite": "^1.0.30001760", + "fraction.js": "^5.3.4", + "picocolors": "^1.1.1", + "postcss-value-parser": "^4.2.0" + }, + "bin": { + "autoprefixer": "bin/autoprefixer" + }, + "engines": { + "node": "^10 || ^12 || >=14" + }, + "peerDependencies": { + "postcss": "^8.1.0" + } + }, + "node_modules/available-typed-arrays": { + "version": "1.0.7", + "resolved": "https://registry.npmjs.org/available-typed-arrays/-/available-typed-arrays-1.0.7.tgz", + "integrity": "sha512-wvUjBtSGN7+7SjNpq/9M2Tg350UZD3q62IFZLbRAR1bSMlCo1ZaeW+BJ+D090e4hIIZLBcTDWe4Mh4jvUDajzQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "possible-typed-array-names": "^1.0.0" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/bail": { + "version": "2.0.2", + "resolved": "https://registry.npmjs.org/bail/-/bail-2.0.2.tgz", + "integrity": 
"sha512-0xO6mYd7JB2YesxDKplafRpsiOzPt9V02ddPCLbY1xYGPOX24NTyN50qnUxgCPcSoYMhKpAuBTjQoRZCAkUDRw==", + "license": "MIT", + "funding": { + "type": "github", + "url": "https://github.com/sponsors/wooorm" + } + }, + "node_modules/balanced-match": { + "version": "1.0.2", + "resolved": "https://registry.npmjs.org/balanced-match/-/balanced-match-1.0.2.tgz", + "integrity": "sha512-3oSeUO0TMV67hN1AmbXsK4yaqU7tjiHlbxRDZOpH0KW9+CeX4bRAaX0Anxt0tx2MrpRpWwQaPwIlISEJhYU5Pw==", + "dev": true, + "license": "MIT" + }, + "node_modules/baseline-browser-mapping": { + "version": "2.9.14", + "resolved": "https://registry.npmjs.org/baseline-browser-mapping/-/baseline-browser-mapping-2.9.14.tgz", + "integrity": "sha512-B0xUquLkiGLgHhpPBqvl7GWegWBUNuujQ6kXd/r1U38ElPT6Ok8KZ8e+FpUGEc2ZoRQUzq/aUnaKFc/svWUGSg==", + "dev": true, + "license": "Apache-2.0", + "bin": { + "baseline-browser-mapping": "dist/cli.js" + } + }, + "node_modules/brace-expansion": { + "version": "1.1.12", + "resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.12.tgz", + "integrity": "sha512-9T9UjW3r0UW5c1Q7GTwllptXwhvYmEzFhzMfZ9H7FQWt+uZePjZPjBP/W1ZEyZ1twGWom5/56TF4lPcqjnDHcg==", + "dev": true, + "license": "MIT", + "dependencies": { + "balanced-match": "^1.0.0", + "concat-map": "0.0.1" + } + }, + "node_modules/browserslist": { + "version": "4.28.1", + "resolved": "https://registry.npmjs.org/browserslist/-/browserslist-4.28.1.tgz", + "integrity": "sha512-ZC5Bd0LgJXgwGqUknZY/vkUQ04r8NXnJZ3yYi4vDmSiZmC/pdSN0NbNRPxZpbtO4uAfDUAFffO8IZoM3Gj8IkA==", + "dev": true, + "funding": [ + { + "type": "opencollective", + "url": "https://opencollective.com/browserslist" + }, + { + "type": "tidelift", + "url": "https://tidelift.com/funding/github/npm/browserslist" + }, + { + "type": "github", + "url": "https://github.com/sponsors/ai" + } + ], + "license": "MIT", + "dependencies": { + "baseline-browser-mapping": "^2.9.0", + "caniuse-lite": "^1.0.30001759", + "electron-to-chromium": "^1.5.263", + "node-releases": 
"^2.0.27", + "update-browserslist-db": "^1.2.0" + }, + "bin": { + "browserslist": "cli.js" + }, + "engines": { + "node": "^6 || ^7 || ^8 || ^9 || ^10 || ^11 || ^12 || >=13.7" + } + }, + "node_modules/call-bind": { + "version": "1.0.8", + "resolved": "https://registry.npmjs.org/call-bind/-/call-bind-1.0.8.tgz", + "integrity": "sha512-oKlSFMcMwpUg2ednkhQ454wfWiU/ul3CkJe/PEHcTKuiX6RpbehUiFMXu13HalGZxfUwCQzZG747YXBn1im9ww==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bind-apply-helpers": "^1.0.0", + "es-define-property": "^1.0.0", + "get-intrinsic": "^1.2.4", + "set-function-length": "^1.2.2" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/call-bind-apply-helpers": { + "version": "1.0.2", + "resolved": "https://registry.npmjs.org/call-bind-apply-helpers/-/call-bind-apply-helpers-1.0.2.tgz", + "integrity": "sha512-Sp1ablJ0ivDkSzjcaJdxEunN5/XvksFJ2sMBFfq6x0ryhQV/2b/KwFe21cMpmHtPOSij8K99/wSfoEuTObmuMQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "es-errors": "^1.3.0", + "function-bind": "^1.1.2" + }, + "engines": { + "node": ">= 0.4" + } + }, + "node_modules/call-bound": { + "version": "1.0.4", + "resolved": "https://registry.npmjs.org/call-bound/-/call-bound-1.0.4.tgz", + "integrity": "sha512-+ys997U96po4Kx/ABpBCqhA9EuxJaQWDQg7295H4hBphv3IZg0boBKuwYpt4YXp6MZ5AmZQnU/tyMTlRpaSejg==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bind-apply-helpers": "^1.0.2", + "get-intrinsic": "^1.3.0" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/callsites": { + "version": "3.1.0", + "resolved": "https://registry.npmjs.org/callsites/-/callsites-3.1.0.tgz", + "integrity": "sha512-P8BjAsXvZS+VIDUI11hHCQEv74YT67YUi5JJFNWIqL235sBmjX4+qx9Muvls5ivyNENctx46xQLQ3aTuE7ssaQ==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=6" + } + }, + "node_modules/caniuse-lite": 
{ + "version": "1.0.30001764", + "resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001764.tgz", + "integrity": "sha512-9JGuzl2M+vPL+pz70gtMF9sHdMFbY9FJaQBi186cHKH3pSzDvzoUJUPV6fqiKIMyXbud9ZLg4F3Yza1vJ1+93g==", + "dev": true, + "funding": [ + { + "type": "opencollective", + "url": "https://opencollective.com/browserslist" + }, + { + "type": "tidelift", + "url": "https://tidelift.com/funding/github/npm/caniuse-lite" + }, + { + "type": "github", + "url": "https://github.com/sponsors/ai" + } + ], + "license": "CC-BY-4.0" + }, + "node_modules/ccount": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/ccount/-/ccount-2.0.1.tgz", + "integrity": "sha512-eyrF0jiFpY+3drT6383f1qhkbGsLSifNAjA61IUjZjmLCWjItY6LB9ft9YhoDgwfmclB2zhu51Lc7+95b8NRAg==", + "license": "MIT", + "funding": { + "type": "github", + "url": "https://github.com/sponsors/wooorm" + } + }, + "node_modules/chalk": { + "version": "4.1.2", + "resolved": "https://registry.npmjs.org/chalk/-/chalk-4.1.2.tgz", + "integrity": "sha512-oKnbhFyRIXpUuez8iBMmyEa4nbj4IOQyuhc/wy9kY7/WVPcwIO9VA668Pu8RkO7+0G76SLROeyw9CpQ061i4mA==", + "dev": true, + "license": "MIT", + "dependencies": { + "ansi-styles": "^4.1.0", + "supports-color": "^7.1.0" + }, + "engines": { + "node": ">=10" + }, + "funding": { + "url": "https://github.com/chalk/chalk?sponsor=1" + } + }, + "node_modules/character-entities": { + "version": "2.0.2", + "resolved": "https://registry.npmjs.org/character-entities/-/character-entities-2.0.2.tgz", + "integrity": "sha512-shx7oQ0Awen/BRIdkjkvz54PnEEI/EjwXDSIZp86/KKdbafHh1Df/RYGBhn4hbe2+uKC9FnT5UCEdyPz3ai9hQ==", + "license": "MIT", + "funding": { + "type": "github", + "url": "https://github.com/sponsors/wooorm" + } + }, + "node_modules/character-entities-html4": { + "version": "2.1.0", + "resolved": "https://registry.npmjs.org/character-entities-html4/-/character-entities-html4-2.1.0.tgz", + "integrity": 
"sha512-1v7fgQRj6hnSwFpq1Eu0ynr/CDEw0rXo2B61qXrLNdHZmPKgb7fqS1a2JwF0rISo9q77jDI8VMEHoApn8qDoZA==", + "license": "MIT", + "funding": { + "type": "github", + "url": "https://github.com/sponsors/wooorm" + } + }, + "node_modules/character-entities-legacy": { + "version": "3.0.0", + "resolved": "https://registry.npmjs.org/character-entities-legacy/-/character-entities-legacy-3.0.0.tgz", + "integrity": "sha512-RpPp0asT/6ufRm//AJVwpViZbGM/MkjQFxJccQRHmISF/22NBtsHqAWmL+/pmkPWoIUJdWyeVleTl1wydHATVQ==", + "license": "MIT", + "funding": { + "type": "github", + "url": "https://github.com/sponsors/wooorm" + } + }, + "node_modules/character-reference-invalid": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/character-reference-invalid/-/character-reference-invalid-2.0.1.tgz", + "integrity": "sha512-iBZ4F4wRbyORVsu0jPV7gXkOsGYjGHPmAyv+HiHG8gi5PtC9KI2j1+v8/tlibRvjoWX027ypmG/n0HtO5t7unw==", + "license": "MIT", + "funding": { + "type": "github", + "url": "https://github.com/sponsors/wooorm" + } + }, + "node_modules/color-convert": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz", + "integrity": "sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "color-name": "~1.1.4" + }, + "engines": { + "node": ">=7.0.0" + } + }, + "node_modules/color-name": { + "version": "1.1.4", + "resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.4.tgz", + "integrity": "sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==", + "dev": true, + "license": "MIT" + }, + "node_modules/comma-separated-tokens": { + "version": "2.0.3", + "resolved": "https://registry.npmjs.org/comma-separated-tokens/-/comma-separated-tokens-2.0.3.tgz", + "integrity": "sha512-Fu4hJdvzeylCfQPp9SGWidpzrMs7tTrlu6Vb8XGaRGck8QSNZJJp538Wrb60Lax4fPwR64ViY468OIUTbRlGZg==", + "license": "MIT", + "funding": { + 
"type": "github", + "url": "https://github.com/sponsors/wooorm" + } + }, + "node_modules/concat-map": { + "version": "0.0.1", + "resolved": "https://registry.npmjs.org/concat-map/-/concat-map-0.0.1.tgz", + "integrity": "sha512-/Srv4dswyQNBfohGpz9o6Yb3Gz3SrUDqBH5rTuhGR7ahtlbYKnVxw2bCFMRljaA7EXHaXZ8wsHdodFvbkhKmqg==", + "dev": true, + "license": "MIT" + }, + "node_modules/convert-source-map": { + "version": "2.0.0", + "resolved": "https://registry.npmjs.org/convert-source-map/-/convert-source-map-2.0.0.tgz", + "integrity": "sha512-Kvp459HrV2FEJ1CAsi1Ku+MY3kasH19TFykTz2xWmMeq6bk2NU3XXvfJ+Q61m0xktWwt+1HSYf3JZsTms3aRJg==", + "dev": true, + "license": "MIT" + }, + "node_modules/cross-spawn": { + "version": "7.0.6", + "resolved": "https://registry.npmjs.org/cross-spawn/-/cross-spawn-7.0.6.tgz", + "integrity": "sha512-uV2QOWP2nWzsy2aMp8aRibhi9dlzF5Hgh5SHaB9OiTGEyDTiJJyx0uy51QXdyWbtAHNua4XJzUKca3OzKUd3vA==", + "dev": true, + "license": "MIT", + "dependencies": { + "path-key": "^3.1.0", + "shebang-command": "^2.0.0", + "which": "^2.0.1" + }, + "engines": { + "node": ">= 8" + } + }, + "node_modules/csstype": { + "version": "3.2.3", + "resolved": "https://registry.npmjs.org/csstype/-/csstype-3.2.3.tgz", + "integrity": "sha512-z1HGKcYy2xA8AGQfwrn0PAy+PB7X/GSj3UVJW9qKyn43xWa+gl5nXmU4qqLMRzWVLFC8KusUX8T/0kCiOYpAIQ==", + "license": "MIT" + }, + "node_modules/data-view-buffer": { + "version": "1.0.2", + "resolved": "https://registry.npmjs.org/data-view-buffer/-/data-view-buffer-1.0.2.tgz", + "integrity": "sha512-EmKO5V3OLXh1rtK2wgXRansaK1/mtVdTUEiEI0W8RkvgT05kfxaH29PliLnpLP73yYO6142Q72QNa8Wx/A5CqQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bound": "^1.0.3", + "es-errors": "^1.3.0", + "is-data-view": "^1.0.2" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/data-view-byte-length": { + "version": "1.0.2", + "resolved": 
"https://registry.npmjs.org/data-view-byte-length/-/data-view-byte-length-1.0.2.tgz", + "integrity": "sha512-tuhGbE6CfTM9+5ANGf+oQb72Ky/0+s3xKUpHvShfiz2RxMFgFPjsXuRLBVMtvMs15awe45SRb83D6wH4ew6wlQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bound": "^1.0.3", + "es-errors": "^1.3.0", + "is-data-view": "^1.0.2" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/inspect-js" + } + }, + "node_modules/data-view-byte-offset": { + "version": "1.0.1", + "resolved": "https://registry.npmjs.org/data-view-byte-offset/-/data-view-byte-offset-1.0.1.tgz", + "integrity": "sha512-BS8PfmtDGnrgYdOonGZQdLZslWIeCGFP9tpan0hi1Co2Zr2NKADsvGYA8XxuG/4UWgJ6Cjtv+YJnB6MM69QGlQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bound": "^1.0.2", + "es-errors": "^1.3.0", + "is-data-view": "^1.0.1" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/debug": { + "version": "4.4.3", + "resolved": "https://registry.npmjs.org/debug/-/debug-4.4.3.tgz", + "integrity": "sha512-RGwwWnwQvkVfavKVt22FGLw+xYSdzARwm0ru6DhTVA3umU5hZc28V3kO4stgYryrTlLpuvgI9GiijltAjNbcqA==", + "license": "MIT", + "dependencies": { + "ms": "^2.1.3" + }, + "engines": { + "node": ">=6.0" + }, + "peerDependenciesMeta": { + "supports-color": { + "optional": true + } + } + }, + "node_modules/decode-named-character-reference": { + "version": "1.2.0", + "resolved": "https://registry.npmjs.org/decode-named-character-reference/-/decode-named-character-reference-1.2.0.tgz", + "integrity": "sha512-c6fcElNV6ShtZXmsgNgFFV5tVX2PaV4g+MOAkb8eXHvn6sryJBrZa9r0zV6+dtTyoCKxtDy5tyQ5ZwQuidtd+Q==", + "license": "MIT", + "dependencies": { + "character-entities": "^2.0.0" + }, + "funding": { + "type": "github", + "url": "https://github.com/sponsors/wooorm" + } + }, + "node_modules/deep-is": { + "version": "0.1.4", + "resolved": "https://registry.npmjs.org/deep-is/-/deep-is-0.1.4.tgz", + 
"integrity": "sha512-oIPzksmTg4/MriiaYGO+okXDT7ztn/w3Eptv/+gSIdMdKsJo0u4CfYNFJPy+4SKMuCqGw2wxnA+URMg3t8a/bQ==", + "dev": true, + "license": "MIT" + }, + "node_modules/define-data-property": { + "version": "1.1.4", + "resolved": "https://registry.npmjs.org/define-data-property/-/define-data-property-1.1.4.tgz", + "integrity": "sha512-rBMvIzlpA8v6E+SJZoo++HAYqsLrkg7MSfIinMPFhmkorw7X+dOXVJQs+QT69zGkzMyfDnIMN2Wid1+NbL3T+A==", + "dev": true, + "license": "MIT", + "dependencies": { + "es-define-property": "^1.0.0", + "es-errors": "^1.3.0", + "gopd": "^1.0.1" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/define-properties": { + "version": "1.2.1", + "resolved": "https://registry.npmjs.org/define-properties/-/define-properties-1.2.1.tgz", + "integrity": "sha512-8QmQKqEASLd5nx0U1B1okLElbUuuttJ/AnYmRXbbbGDWh6uS208EjD4Xqq/I9wK7u0v6O08XhTWnt5XtEbR6Dg==", + "dev": true, + "license": "MIT", + "dependencies": { + "define-data-property": "^1.0.1", + "has-property-descriptors": "^1.0.0", + "object-keys": "^1.1.1" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/dequal": { + "version": "2.0.3", + "resolved": "https://registry.npmjs.org/dequal/-/dequal-2.0.3.tgz", + "integrity": "sha512-0je+qPKHEMohvfRTCEo3CrPG6cAzAYgmzKyxRiYSSDkS6eGJdyVJm7WaYA5ECaAD9wLB2T4EEeymA5aFVcYXCA==", + "license": "MIT", + "engines": { + "node": ">=6" + } + }, + "node_modules/detect-libc": { + "version": "2.1.2", + "resolved": "https://registry.npmjs.org/detect-libc/-/detect-libc-2.1.2.tgz", + "integrity": "sha512-Btj2BOOO83o3WyH59e8MgXsxEQVcarkUOpEYrubB0urwnN10yQ364rsiByU11nZlqWYZm05i/of7io4mzihBtQ==", + "dev": true, + "license": "Apache-2.0", + "engines": { + "node": ">=8" + } + }, + "node_modules/devlop": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/devlop/-/devlop-1.1.0.tgz", + "integrity": 
"sha512-RWmIqhcFf1lRYBvNmr7qTNuyCt/7/ns2jbpp1+PalgE/rDQcBT0fioSMUpJ93irlUhC5hrg4cYqe6U+0ImW0rA==", + "license": "MIT", + "dependencies": { + "dequal": "^2.0.0" + }, + "funding": { + "type": "github", + "url": "https://github.com/sponsors/wooorm" + } + }, + "node_modules/doctrine": { + "version": "2.1.0", + "resolved": "https://registry.npmjs.org/doctrine/-/doctrine-2.1.0.tgz", + "integrity": "sha512-35mSku4ZXK0vfCuHEDAwt55dg2jNajHZ1odvF+8SSr82EsZY4QmXfuWso8oEd8zRhVObSN18aM0CjSdoBX7zIw==", + "dev": true, + "license": "Apache-2.0", + "dependencies": { + "esutils": "^2.0.2" + }, + "engines": { + "node": ">=0.10.0" + } + }, + "node_modules/dunder-proto": { + "version": "1.0.1", + "resolved": "https://registry.npmjs.org/dunder-proto/-/dunder-proto-1.0.1.tgz", + "integrity": "sha512-KIN/nDJBQRcXw0MLVhZE9iQHmG68qAVIBg9CqmUYjmQIhgij9U5MFvrqkUL5FbtyyzZuOeOt0zdeRe4UY7ct+A==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bind-apply-helpers": "^1.0.1", + "es-errors": "^1.3.0", + "gopd": "^1.2.0" + }, + "engines": { + "node": ">= 0.4" + } + }, + "node_modules/electron-to-chromium": { + "version": "1.5.267", + "resolved": "https://registry.npmjs.org/electron-to-chromium/-/electron-to-chromium-1.5.267.tgz", + "integrity": "sha512-0Drusm6MVRXSOJpGbaSVgcQsuB4hEkMpHXaVstcPmhu5LIedxs1xNK/nIxmQIU/RPC0+1/o0AVZfBTkTNJOdUw==", + "dev": true, + "license": "ISC" + }, + "node_modules/enhanced-resolve": { + "version": "5.18.4", + "resolved": "https://registry.npmjs.org/enhanced-resolve/-/enhanced-resolve-5.18.4.tgz", + "integrity": "sha512-LgQMM4WXU3QI+SYgEc2liRgznaD5ojbmY3sb8LxyguVkIg5FxdpTkvk72te2R38/TGKxH634oLxXRGY6d7AP+Q==", + "dev": true, + "license": "MIT", + "dependencies": { + "graceful-fs": "^4.2.4", + "tapable": "^2.2.0" + }, + "engines": { + "node": ">=10.13.0" + } + }, + "node_modules/es-abstract": { + "version": "1.24.1", + "resolved": "https://registry.npmjs.org/es-abstract/-/es-abstract-1.24.1.tgz", + "integrity": 
"sha512-zHXBLhP+QehSSbsS9Pt23Gg964240DPd6QCf8WpkqEXxQ7fhdZzYsocOr5u7apWonsS5EjZDmTF+/slGMyasvw==", + "dev": true, + "license": "MIT", + "dependencies": { + "array-buffer-byte-length": "^1.0.2", + "arraybuffer.prototype.slice": "^1.0.4", + "available-typed-arrays": "^1.0.7", + "call-bind": "^1.0.8", + "call-bound": "^1.0.4", + "data-view-buffer": "^1.0.2", + "data-view-byte-length": "^1.0.2", + "data-view-byte-offset": "^1.0.1", + "es-define-property": "^1.0.1", + "es-errors": "^1.3.0", + "es-object-atoms": "^1.1.1", + "es-set-tostringtag": "^2.1.0", + "es-to-primitive": "^1.3.0", + "function.prototype.name": "^1.1.8", + "get-intrinsic": "^1.3.0", + "get-proto": "^1.0.1", + "get-symbol-description": "^1.1.0", + "globalthis": "^1.0.4", + "gopd": "^1.2.0", + "has-property-descriptors": "^1.0.2", + "has-proto": "^1.2.0", + "has-symbols": "^1.1.0", + "hasown": "^2.0.2", + "internal-slot": "^1.1.0", + "is-array-buffer": "^3.0.5", + "is-callable": "^1.2.7", + "is-data-view": "^1.0.2", + "is-negative-zero": "^2.0.3", + "is-regex": "^1.2.1", + "is-set": "^2.0.3", + "is-shared-array-buffer": "^1.0.4", + "is-string": "^1.1.1", + "is-typed-array": "^1.1.15", + "is-weakref": "^1.1.1", + "math-intrinsics": "^1.1.0", + "object-inspect": "^1.13.4", + "object-keys": "^1.1.1", + "object.assign": "^4.1.7", + "own-keys": "^1.0.1", + "regexp.prototype.flags": "^1.5.4", + "safe-array-concat": "^1.1.3", + "safe-push-apply": "^1.0.0", + "safe-regex-test": "^1.1.0", + "set-proto": "^1.0.0", + "stop-iteration-iterator": "^1.1.0", + "string.prototype.trim": "^1.2.10", + "string.prototype.trimend": "^1.0.9", + "string.prototype.trimstart": "^1.0.8", + "typed-array-buffer": "^1.0.3", + "typed-array-byte-length": "^1.0.3", + "typed-array-byte-offset": "^1.0.4", + "typed-array-length": "^1.0.7", + "unbox-primitive": "^1.1.0", + "which-typed-array": "^1.1.19" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + 
"node_modules/es-define-property": { + "version": "1.0.1", + "resolved": "https://registry.npmjs.org/es-define-property/-/es-define-property-1.0.1.tgz", + "integrity": "sha512-e3nRfgfUZ4rNGL232gUgX06QNyyez04KdjFrF+LTRoOXmrOgFKDg4BCdsjW8EnT69eqdYGmRpJwiPVYNrCaW3g==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">= 0.4" + } + }, + "node_modules/es-errors": { + "version": "1.3.0", + "resolved": "https://registry.npmjs.org/es-errors/-/es-errors-1.3.0.tgz", + "integrity": "sha512-Zf5H2Kxt2xjTvbJvP2ZWLEICxA6j+hAmMzIlypy4xcBg1vKVnx89Wy0GbS+kf5cwCVFFzdCFh2XSCFNULS6csw==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">= 0.4" + } + }, + "node_modules/es-iterator-helpers": { + "version": "1.2.2", + "resolved": "https://registry.npmjs.org/es-iterator-helpers/-/es-iterator-helpers-1.2.2.tgz", + "integrity": "sha512-BrUQ0cPTB/IwXj23HtwHjS9n7O4h9FX94b4xc5zlTHxeLgTAdzYUDyy6KdExAl9lbN5rtfe44xpjpmj9grxs5w==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bind": "^1.0.8", + "call-bound": "^1.0.4", + "define-properties": "^1.2.1", + "es-abstract": "^1.24.1", + "es-errors": "^1.3.0", + "es-set-tostringtag": "^2.1.0", + "function-bind": "^1.1.2", + "get-intrinsic": "^1.3.0", + "globalthis": "^1.0.4", + "gopd": "^1.2.0", + "has-property-descriptors": "^1.0.2", + "has-proto": "^1.2.0", + "has-symbols": "^1.1.0", + "internal-slot": "^1.1.0", + "iterator.prototype": "^1.1.5", + "safe-array-concat": "^1.1.3" + }, + "engines": { + "node": ">= 0.4" + } + }, + "node_modules/es-object-atoms": { + "version": "1.1.1", + "resolved": "https://registry.npmjs.org/es-object-atoms/-/es-object-atoms-1.1.1.tgz", + "integrity": "sha512-FGgH2h8zKNim9ljj7dankFPcICIK9Cp5bm+c2gQSYePhpaG5+esrLODihIorn+Pe6FGJzWhXQotPv73jTaldXA==", + "dev": true, + "license": "MIT", + "dependencies": { + "es-errors": "^1.3.0" + }, + "engines": { + "node": ">= 0.4" + } + }, + "node_modules/es-set-tostringtag": { + "version": "2.1.0", + "resolved": 
"https://registry.npmjs.org/es-set-tostringtag/-/es-set-tostringtag-2.1.0.tgz", + "integrity": "sha512-j6vWzfrGVfyXxge+O0x5sh6cvxAog0a/4Rdd2K36zCMV5eJ+/+tOAngRO8cODMNWbVRdVlmGZQL2YS3yR8bIUA==", + "dev": true, + "license": "MIT", + "dependencies": { + "es-errors": "^1.3.0", + "get-intrinsic": "^1.2.6", + "has-tostringtag": "^1.0.2", + "hasown": "^2.0.2" + }, + "engines": { + "node": ">= 0.4" + } + }, + "node_modules/es-shim-unscopables": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/es-shim-unscopables/-/es-shim-unscopables-1.1.0.tgz", + "integrity": "sha512-d9T8ucsEhh8Bi1woXCf+TIKDIROLG5WCkxg8geBCbvk22kzwC5G2OnXVMO6FUsvQlgUUXQ2itephWDLqDzbeCw==", + "dev": true, + "license": "MIT", + "dependencies": { + "hasown": "^2.0.2" + }, + "engines": { + "node": ">= 0.4" + } + }, + "node_modules/es-to-primitive": { + "version": "1.3.0", + "resolved": "https://registry.npmjs.org/es-to-primitive/-/es-to-primitive-1.3.0.tgz", + "integrity": "sha512-w+5mJ3GuFL+NjVtJlvydShqE1eN3h3PbI7/5LAsYJP/2qtuMXjfL2LpHSRqo4b4eSF5K/DH1JXKUAHSB2UW50g==", + "dev": true, + "license": "MIT", + "dependencies": { + "is-callable": "^1.2.7", + "is-date-object": "^1.0.5", + "is-symbol": "^1.0.4" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/esbuild": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/esbuild/-/esbuild-0.25.12.tgz", + "integrity": "sha512-bbPBYYrtZbkt6Os6FiTLCTFxvq4tt3JKall1vRwshA3fdVztsLAatFaZobhkBC8/BrPetoa0oksYoKXoG4ryJg==", + "dev": true, + "hasInstallScript": true, + "license": "MIT", + "bin": { + "esbuild": "bin/esbuild" + }, + "engines": { + "node": ">=18" + }, + "optionalDependencies": { + "@esbuild/aix-ppc64": "0.25.12", + "@esbuild/android-arm": "0.25.12", + "@esbuild/android-arm64": "0.25.12", + "@esbuild/android-x64": "0.25.12", + "@esbuild/darwin-arm64": "0.25.12", + "@esbuild/darwin-x64": "0.25.12", + "@esbuild/freebsd-arm64": "0.25.12", + 
"@esbuild/freebsd-x64": "0.25.12", + "@esbuild/linux-arm": "0.25.12", + "@esbuild/linux-arm64": "0.25.12", + "@esbuild/linux-ia32": "0.25.12", + "@esbuild/linux-loong64": "0.25.12", + "@esbuild/linux-mips64el": "0.25.12", + "@esbuild/linux-ppc64": "0.25.12", + "@esbuild/linux-riscv64": "0.25.12", + "@esbuild/linux-s390x": "0.25.12", + "@esbuild/linux-x64": "0.25.12", + "@esbuild/netbsd-arm64": "0.25.12", + "@esbuild/netbsd-x64": "0.25.12", + "@esbuild/openbsd-arm64": "0.25.12", + "@esbuild/openbsd-x64": "0.25.12", + "@esbuild/openharmony-arm64": "0.25.12", + "@esbuild/sunos-x64": "0.25.12", + "@esbuild/win32-arm64": "0.25.12", + "@esbuild/win32-ia32": "0.25.12", + "@esbuild/win32-x64": "0.25.12" + } + }, + "node_modules/escalade": { + "version": "3.2.0", + "resolved": "https://registry.npmjs.org/escalade/-/escalade-3.2.0.tgz", + "integrity": "sha512-WUj2qlxaQtO4g6Pq5c29GTcWGDyd8itL8zTlipgECz3JesAiiOKotd8JU6otB3PACgG6xkJUyVhboMS+bje/jA==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=6" + } + }, + "node_modules/escape-string-regexp": { + "version": "4.0.0", + "resolved": "https://registry.npmjs.org/escape-string-regexp/-/escape-string-regexp-4.0.0.tgz", + "integrity": "sha512-TtpcNJ3XAzx3Gq8sWRzJaVajRs0uVxA2YAkdb1jm2YkPz4G6egUFAyA3n5vtEIZefPk5Wa4UXbKuS5fKkJWdgA==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=10" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, + "node_modules/eslint": { + "version": "9.39.2", + "resolved": "https://registry.npmjs.org/eslint/-/eslint-9.39.2.tgz", + "integrity": "sha512-LEyamqS7W5HB3ujJyvi0HQK/dtVINZvd5mAAp9eT5S/ujByGjiZLCzPcHVzuXbpJDJF/cxwHlfceVUDZ2lnSTw==", + "dev": true, + "license": "MIT", + "dependencies": { + "@eslint-community/eslint-utils": "^4.8.0", + "@eslint-community/regexpp": "^4.12.1", + "@eslint/config-array": "^0.21.1", + "@eslint/config-helpers": "^0.4.2", + "@eslint/core": "^0.17.0", + "@eslint/eslintrc": "^3.3.1", + "@eslint/js": "9.39.2", + 
"@eslint/plugin-kit": "^0.4.1", + "@humanfs/node": "^0.16.6", + "@humanwhocodes/module-importer": "^1.0.1", + "@humanwhocodes/retry": "^0.4.2", + "@types/estree": "^1.0.6", + "ajv": "^6.12.4", + "chalk": "^4.0.0", + "cross-spawn": "^7.0.6", + "debug": "^4.3.2", + "escape-string-regexp": "^4.0.0", + "eslint-scope": "^8.4.0", + "eslint-visitor-keys": "^4.2.1", + "espree": "^10.4.0", + "esquery": "^1.5.0", + "esutils": "^2.0.2", + "fast-deep-equal": "^3.1.3", + "file-entry-cache": "^8.0.0", + "find-up": "^5.0.0", + "glob-parent": "^6.0.2", + "ignore": "^5.2.0", + "imurmurhash": "^0.1.4", + "is-glob": "^4.0.0", + "json-stable-stringify-without-jsonify": "^1.0.1", + "lodash.merge": "^4.6.2", + "minimatch": "^3.1.2", + "natural-compare": "^1.4.0", + "optionator": "^0.9.3" + }, + "bin": { + "eslint": "bin/eslint.js" + }, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + }, + "funding": { + "url": "https://eslint.org/donate" + }, + "peerDependencies": { + "jiti": "*" + }, + "peerDependenciesMeta": { + "jiti": { + "optional": true + } + } + }, + "node_modules/eslint-plugin-react": { + "version": "7.37.5", + "resolved": "https://registry.npmjs.org/eslint-plugin-react/-/eslint-plugin-react-7.37.5.tgz", + "integrity": "sha512-Qteup0SqU15kdocexFNAJMvCJEfa2xUKNV4CC1xsVMrIIqEy3SQ/rqyxCWNzfrd3/ldy6HMlD2e0JDVpDg2qIA==", + "dev": true, + "license": "MIT", + "dependencies": { + "array-includes": "^3.1.8", + "array.prototype.findlast": "^1.2.5", + "array.prototype.flatmap": "^1.3.3", + "array.prototype.tosorted": "^1.1.4", + "doctrine": "^2.1.0", + "es-iterator-helpers": "^1.2.1", + "estraverse": "^5.3.0", + "hasown": "^2.0.2", + "jsx-ast-utils": "^2.4.1 || ^3.0.0", + "minimatch": "^3.1.2", + "object.entries": "^1.1.9", + "object.fromentries": "^2.0.8", + "object.values": "^1.2.1", + "prop-types": "^15.8.1", + "resolve": "^2.0.0-next.5", + "semver": "^6.3.1", + "string.prototype.matchall": "^4.0.12", + "string.prototype.repeat": "^1.0.0" + }, + "engines": { + "node": ">=4" 
+ }, + "peerDependencies": { + "eslint": "^3 || ^4 || ^5 || ^6 || ^7 || ^8 || ^9.7" + } + }, + "node_modules/eslint-plugin-react-hooks": { + "version": "5.2.0", + "resolved": "https://registry.npmjs.org/eslint-plugin-react-hooks/-/eslint-plugin-react-hooks-5.2.0.tgz", + "integrity": "sha512-+f15FfK64YQwZdJNELETdn5ibXEUQmW1DZL6KXhNnc2heoy/sg9VJJeT7n8TlMWouzWqSWavFkIhHyIbIAEapg==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=10" + }, + "peerDependencies": { + "eslint": "^3.0.0 || ^4.0.0 || ^5.0.0 || ^6.0.0 || ^7.0.0 || ^8.0.0-0 || ^9.0.0" + } + }, + "node_modules/eslint-plugin-react-refresh": { + "version": "0.4.26", + "resolved": "https://registry.npmjs.org/eslint-plugin-react-refresh/-/eslint-plugin-react-refresh-0.4.26.tgz", + "integrity": "sha512-1RETEylht2O6FM/MvgnyvT+8K21wLqDNg4qD51Zj3guhjt433XbnnkVttHMyaVyAFD03QSV4LPS5iE3VQmO7XQ==", + "dev": true, + "license": "MIT", + "peerDependencies": { + "eslint": ">=8.40" + } + }, + "node_modules/eslint-scope": { + "version": "8.4.0", + "resolved": "https://registry.npmjs.org/eslint-scope/-/eslint-scope-8.4.0.tgz", + "integrity": "sha512-sNXOfKCn74rt8RICKMvJS7XKV/Xk9kA7DyJr8mJik3S7Cwgy3qlkkmyS2uQB3jiJg6VNdZd/pDBJu0nvG2NlTg==", + "dev": true, + "license": "BSD-2-Clause", + "dependencies": { + "esrecurse": "^4.3.0", + "estraverse": "^5.2.0" + }, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + }, + "funding": { + "url": "https://opencollective.com/eslint" + } + }, + "node_modules/eslint-visitor-keys": { + "version": "4.2.1", + "resolved": "https://registry.npmjs.org/eslint-visitor-keys/-/eslint-visitor-keys-4.2.1.tgz", + "integrity": "sha512-Uhdk5sfqcee/9H/rCOJikYz67o0a2Tw2hGRPOG2Y1R2dg7brRe1uG0yaNQDHu+TO/uQPF/5eCapvYSmHUjt7JQ==", + "dev": true, + "license": "Apache-2.0", + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + }, + "funding": { + "url": "https://opencollective.com/eslint" + } + }, + "node_modules/espree": { + "version": "10.4.0", + "resolved": 
"https://registry.npmjs.org/espree/-/espree-10.4.0.tgz", + "integrity": "sha512-j6PAQ2uUr79PZhBjP5C5fhl8e39FmRnOjsD5lGnWrFU8i2G776tBK7+nP8KuQUTTyAZUwfQqXAgrVH5MbH9CYQ==", + "dev": true, + "license": "BSD-2-Clause", + "dependencies": { + "acorn": "^8.15.0", + "acorn-jsx": "^5.3.2", + "eslint-visitor-keys": "^4.2.1" + }, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + }, + "funding": { + "url": "https://opencollective.com/eslint" + } + }, + "node_modules/esquery": { + "version": "1.7.0", + "resolved": "https://registry.npmjs.org/esquery/-/esquery-1.7.0.tgz", + "integrity": "sha512-Ap6G0WQwcU/LHsvLwON1fAQX9Zp0A2Y6Y/cJBl9r/JbW90Zyg4/zbG6zzKa2OTALELarYHmKu0GhpM5EO+7T0g==", + "dev": true, + "license": "BSD-3-Clause", + "dependencies": { + "estraverse": "^5.1.0" + }, + "engines": { + "node": ">=0.10" + } + }, + "node_modules/esrecurse": { + "version": "4.3.0", + "resolved": "https://registry.npmjs.org/esrecurse/-/esrecurse-4.3.0.tgz", + "integrity": "sha512-KmfKL3b6G+RXvP8N1vr3Tq1kL/oCFgn2NYXEtqP8/L3pKapUA4G8cFVaoF3SU323CD4XypR/ffioHmkti6/Tag==", + "dev": true, + "license": "BSD-2-Clause", + "dependencies": { + "estraverse": "^5.2.0" + }, + "engines": { + "node": ">=4.0" + } + }, + "node_modules/estraverse": { + "version": "5.3.0", + "resolved": "https://registry.npmjs.org/estraverse/-/estraverse-5.3.0.tgz", + "integrity": "sha512-MMdARuVEQziNTeJD8DgMqmhwR11BRQ/cBP+pLtYdSTnf3MIO8fFeiINEbX36ZdNlfU/7A9f3gUw49B3oQsvwBA==", + "dev": true, + "license": "BSD-2-Clause", + "engines": { + "node": ">=4.0" + } + }, + "node_modules/estree-util-is-identifier-name": { + "version": "3.0.0", + "resolved": "https://registry.npmjs.org/estree-util-is-identifier-name/-/estree-util-is-identifier-name-3.0.0.tgz", + "integrity": "sha512-hFtqIDZTIUZ9BXLb8y4pYGyk6+wekIivNVTcmvk8NoOh+VeRn5y6cEHzbURrWbfp1fIqdVipilzj+lfaadNZmg==", + "license": "MIT", + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/unified" + } + }, + "node_modules/esutils": { + "version": 
"2.0.3", + "resolved": "https://registry.npmjs.org/esutils/-/esutils-2.0.3.tgz", + "integrity": "sha512-kVscqXk4OCp68SZ0dkgEKVi6/8ij300KBWTJq32P/dYeWTSwK41WyTxalN1eRmA5Z9UU/LX9D7FWSmV9SAYx6g==", + "dev": true, + "license": "BSD-2-Clause", + "engines": { + "node": ">=0.10.0" + } + }, + "node_modules/extend": { + "version": "3.0.2", + "resolved": "https://registry.npmjs.org/extend/-/extend-3.0.2.tgz", + "integrity": "sha512-fjquC59cD7CyW6urNXK0FBufkZcoiGG80wTuPujX590cB5Ttln20E2UB4S/WARVqhXffZl2LNgS+gQdPIIim/g==", + "license": "MIT" + }, + "node_modules/fast-deep-equal": { + "version": "3.1.3", + "resolved": "https://registry.npmjs.org/fast-deep-equal/-/fast-deep-equal-3.1.3.tgz", + "integrity": "sha512-f3qQ9oQy9j2AhBe/H9VC91wLmKBCCU/gDOnKNAYG5hswO7BLKj09Hc5HYNz9cGI++xlpDCIgDaitVs03ATR84Q==", + "dev": true, + "license": "MIT" + }, + "node_modules/fast-json-stable-stringify": { + "version": "2.1.0", + "resolved": "https://registry.npmjs.org/fast-json-stable-stringify/-/fast-json-stable-stringify-2.1.0.tgz", + "integrity": "sha512-lhd/wF+Lk98HZoTCtlVraHtfh5XYijIjalXck7saUtuanSDyLMxnHhSXEDJqHxD7msR8D0uCmqlkwjCV8xvwHw==", + "dev": true, + "license": "MIT" + }, + "node_modules/fast-levenshtein": { + "version": "2.0.6", + "resolved": "https://registry.npmjs.org/fast-levenshtein/-/fast-levenshtein-2.0.6.tgz", + "integrity": "sha512-DCXu6Ifhqcks7TZKY3Hxp3y6qphY5SJZmrWMDrKcERSOXWQdMhU9Ig/PYrzyw/ul9jOIyh0N4M0tbC5hodg8dw==", + "dev": true, + "license": "MIT" + }, + "node_modules/fdir": { + "version": "6.5.0", + "resolved": "https://registry.npmjs.org/fdir/-/fdir-6.5.0.tgz", + "integrity": "sha512-tIbYtZbucOs0BRGqPJkshJUYdL+SDH7dVM8gjy+ERp3WAUjLEFJE+02kanyHtwjWOnwrKYBiwAmM0p4kLJAnXg==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=12.0.0" + }, + "peerDependencies": { + "picomatch": "^3 || ^4" + }, + "peerDependenciesMeta": { + "picomatch": { + "optional": true + } + } + }, + "node_modules/file-entry-cache": { + "version": "8.0.0", + "resolved": 
"https://registry.npmjs.org/file-entry-cache/-/file-entry-cache-8.0.0.tgz", + "integrity": "sha512-XXTUwCvisa5oacNGRP9SfNtYBNAMi+RPwBFmblZEF7N7swHYQS6/Zfk7SRwx4D5j3CH211YNRco1DEMNVfZCnQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "flat-cache": "^4.0.0" + }, + "engines": { + "node": ">=16.0.0" + } + }, + "node_modules/find-up": { + "version": "5.0.0", + "resolved": "https://registry.npmjs.org/find-up/-/find-up-5.0.0.tgz", + "integrity": "sha512-78/PXT1wlLLDgTzDs7sjq9hzz0vXD+zn+7wypEe4fXQxCmdmqfGsEPQxmiCSQI3ajFV91bVSsvNtrJRiW6nGng==", + "dev": true, + "license": "MIT", + "dependencies": { + "locate-path": "^6.0.0", + "path-exists": "^4.0.0" + }, + "engines": { + "node": ">=10" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, + "node_modules/flat-cache": { + "version": "4.0.1", + "resolved": "https://registry.npmjs.org/flat-cache/-/flat-cache-4.0.1.tgz", + "integrity": "sha512-f7ccFPK3SXFHpx15UIGyRJ/FJQctuKZ0zVuN3frBo4HnK3cay9VEW0R6yPYFHC0AgqhukPzKjq22t5DmAyqGyw==", + "dev": true, + "license": "MIT", + "dependencies": { + "flatted": "^3.2.9", + "keyv": "^4.5.4" + }, + "engines": { + "node": ">=16" + } + }, + "node_modules/flatted": { + "version": "3.3.3", + "resolved": "https://registry.npmjs.org/flatted/-/flatted-3.3.3.tgz", + "integrity": "sha512-GX+ysw4PBCz0PzosHDepZGANEuFCMLrnRTiEy9McGjmkCQYwRq4A/X786G/fjM/+OjsWSU1ZrY5qyARZmO/uwg==", + "dev": true, + "license": "ISC" + }, + "node_modules/for-each": { + "version": "0.3.5", + "resolved": "https://registry.npmjs.org/for-each/-/for-each-0.3.5.tgz", + "integrity": "sha512-dKx12eRCVIzqCxFGplyFKJMPvLEWgmNtUrpTiJIR5u97zEhRG8ySrtboPHZXx7daLxQVrl643cTzbab2tkQjxg==", + "dev": true, + "license": "MIT", + "dependencies": { + "is-callable": "^1.2.7" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/fraction.js": { + "version": "5.3.4", + "resolved": 
"https://registry.npmjs.org/fraction.js/-/fraction.js-5.3.4.tgz", + "integrity": "sha512-1X1NTtiJphryn/uLQz3whtY6jK3fTqoE3ohKs0tT+Ujr1W59oopxmoEh7Lu5p6vBaPbgoM0bzveAW4Qi5RyWDQ==", + "dev": true, + "license": "MIT", + "engines": { + "node": "*" + }, + "funding": { + "type": "github", + "url": "https://github.com/sponsors/rawify" + } + }, + "node_modules/fsevents": { + "version": "2.3.3", + "resolved": "https://registry.npmjs.org/fsevents/-/fsevents-2.3.3.tgz", + "integrity": "sha512-5xoDfX+fL7faATnagmWPpbFtwh/R77WmMMqqHGS65C3vvB0YHrgF+B1YmZ3441tMj5n63k0212XNoJwzlhffQw==", + "dev": true, + "hasInstallScript": true, + "license": "MIT", + "optional": true, + "os": [ + "darwin" + ], + "engines": { + "node": "^8.16.0 || ^10.6.0 || >=11.0.0" + } + }, + "node_modules/function-bind": { + "version": "1.1.2", + "resolved": "https://registry.npmjs.org/function-bind/-/function-bind-1.1.2.tgz", + "integrity": "sha512-7XHNxH7qX9xG5mIwxkhumTox/MIRNcOgDrxWsMt2pAr23WHp6MrRlN7FBSFpCpr+oVO0F744iUgR82nJMfG2SA==", + "dev": true, + "license": "MIT", + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/function.prototype.name": { + "version": "1.1.8", + "resolved": "https://registry.npmjs.org/function.prototype.name/-/function.prototype.name-1.1.8.tgz", + "integrity": "sha512-e5iwyodOHhbMr/yNrc7fDYG4qlbIvI5gajyzPnb5TCwyhjApznQh1BMFou9b30SevY43gCJKXycoCBjMbsuW0Q==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bind": "^1.0.8", + "call-bound": "^1.0.3", + "define-properties": "^1.2.1", + "functions-have-names": "^1.2.3", + "hasown": "^2.0.2", + "is-callable": "^1.2.7" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/functions-have-names": { + "version": "1.2.3", + "resolved": "https://registry.npmjs.org/functions-have-names/-/functions-have-names-1.2.3.tgz", + "integrity": 
"sha512-xckBUXyTIqT97tq2x2AMb+g163b5JFysYk0x4qxNFwbfQkmNZoiRHb6sPzI9/QV33WeuvVYBUIiD4NzNIyqaRQ==", + "dev": true, + "license": "MIT", + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/generator-function": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/generator-function/-/generator-function-2.0.1.tgz", + "integrity": "sha512-SFdFmIJi+ybC0vjlHN0ZGVGHc3lgE0DxPAT0djjVg+kjOnSqclqmj0KQ7ykTOLP6YxoqOvuAODGdcHJn+43q3g==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">= 0.4" + } + }, + "node_modules/gensync": { + "version": "1.0.0-beta.2", + "resolved": "https://registry.npmjs.org/gensync/-/gensync-1.0.0-beta.2.tgz", + "integrity": "sha512-3hN7NaskYvMDLQY55gnW3NQ+mesEAepTqlg+VEbj7zzqEMBVNhzcGYYeqFo/TlYz6eQiFcp1HcsCZO+nGgS8zg==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=6.9.0" + } + }, + "node_modules/get-intrinsic": { + "version": "1.3.0", + "resolved": "https://registry.npmjs.org/get-intrinsic/-/get-intrinsic-1.3.0.tgz", + "integrity": "sha512-9fSjSaos/fRIVIp+xSJlE6lfwhES7LNtKaCBIamHsjr2na1BiABJPo0mOjjz8GJDURarmCPGqaiVg5mfjb98CQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bind-apply-helpers": "^1.0.2", + "es-define-property": "^1.0.1", + "es-errors": "^1.3.0", + "es-object-atoms": "^1.1.1", + "function-bind": "^1.1.2", + "get-proto": "^1.0.1", + "gopd": "^1.2.0", + "has-symbols": "^1.1.0", + "hasown": "^2.0.2", + "math-intrinsics": "^1.1.0" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/get-proto": { + "version": "1.0.1", + "resolved": "https://registry.npmjs.org/get-proto/-/get-proto-1.0.1.tgz", + "integrity": "sha512-sTSfBjoXBp89JvIKIefqw7U2CCebsc74kiY6awiGogKtoSGbgjYE/G/+l9sF3MWFPNc9IcoOC4ODfKHfxFmp0g==", + "dev": true, + "license": "MIT", + "dependencies": { + "dunder-proto": "^1.0.1", + "es-object-atoms": "^1.0.0" + }, + "engines": { + "node": ">= 0.4" + } + }, + 
"node_modules/get-symbol-description": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/get-symbol-description/-/get-symbol-description-1.1.0.tgz", + "integrity": "sha512-w9UMqWwJxHNOvoNzSJ2oPF5wvYcvP7jUvYzhp67yEhTi17ZDBBC1z9pTdGuzjD+EFIqLSYRweZjqfiPzQ06Ebg==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bound": "^1.0.3", + "es-errors": "^1.3.0", + "get-intrinsic": "^1.2.6" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/glob-parent": { + "version": "6.0.2", + "resolved": "https://registry.npmjs.org/glob-parent/-/glob-parent-6.0.2.tgz", + "integrity": "sha512-XxwI8EOhVQgWp6iDL+3b0r86f4d6AX6zSU55HfB4ydCEuXLXc5FcYeOu+nnGftS4TEju/11rt4KJPTMgbfmv4A==", + "dev": true, + "license": "ISC", + "dependencies": { + "is-glob": "^4.0.3" + }, + "engines": { + "node": ">=10.13.0" + } + }, + "node_modules/globals": { + "version": "14.0.0", + "resolved": "https://registry.npmjs.org/globals/-/globals-14.0.0.tgz", + "integrity": "sha512-oahGvuMGQlPw/ivIYBjVSrWAfWLBeku5tpPE2fOPLi+WHffIWbuh2tCjhyQhTBPMf5E9jDEH4FOmTYgYwbKwtQ==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=18" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, + "node_modules/globalthis": { + "version": "1.0.4", + "resolved": "https://registry.npmjs.org/globalthis/-/globalthis-1.0.4.tgz", + "integrity": "sha512-DpLKbNU4WylpxJykQujfCcwYWiV/Jhm50Goo0wrVILAv5jOr9d+H+UR3PhSCD2rCCEIg0uc+G+muBTwD54JhDQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "define-properties": "^1.2.1", + "gopd": "^1.0.1" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/gopd": { + "version": "1.2.0", + "resolved": "https://registry.npmjs.org/gopd/-/gopd-1.2.0.tgz", + "integrity": "sha512-ZUKRh6/kUFoAiTAtTYPZJ3hw9wNxx+BIBOijnlG9PnrJsCcSjs1wyyD6vJpaYtgnzDrKYRSqf3OO6Rfa93xsRg==", + "dev": true, + 
"license": "MIT", + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/graceful-fs": { + "version": "4.2.11", + "resolved": "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.2.11.tgz", + "integrity": "sha512-RbJ5/jmFcNNCcDV5o9eTnBLJ/HszWV0P73bc+Ff4nS/rJj+YaS6IGyiOL0VoBYX+l1Wrl3k63h/KrH+nhJ0XvQ==", + "dev": true, + "license": "ISC" + }, + "node_modules/has-bigints": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/has-bigints/-/has-bigints-1.1.0.tgz", + "integrity": "sha512-R3pbpkcIqv2Pm3dUwgjclDRVmWpTJW2DcMzcIhEXEx1oh/CEMObMm3KLmRJOdvhM7o4uQBnwr8pzRK2sJWIqfg==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/has-flag": { + "version": "4.0.0", + "resolved": "https://registry.npmjs.org/has-flag/-/has-flag-4.0.0.tgz", + "integrity": "sha512-EykJT/Q1KjTWctppgIAgfSO0tKVuZUjhgMr17kqTumMl6Afv3EISleU7qZUzoXDFTAHTDC4NOoG/ZxU3EvlMPQ==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=8" + } + }, + "node_modules/has-property-descriptors": { + "version": "1.0.2", + "resolved": "https://registry.npmjs.org/has-property-descriptors/-/has-property-descriptors-1.0.2.tgz", + "integrity": "sha512-55JNKuIW+vq4Ke1BjOTjM2YctQIvCT7GFzHwmfZPGo5wnrgkid0YQtnAleFSqumZm4az3n2BS+erby5ipJdgrg==", + "dev": true, + "license": "MIT", + "dependencies": { + "es-define-property": "^1.0.0" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/has-proto": { + "version": "1.2.0", + "resolved": "https://registry.npmjs.org/has-proto/-/has-proto-1.2.0.tgz", + "integrity": "sha512-KIL7eQPfHQRC8+XluaIw7BHUwwqL19bQn4hzNgdr+1wXoU0KKj6rufu47lhY7KbJR2C6T6+PfyN0Ea7wkSS+qQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "dunder-proto": "^1.0.0" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": 
"https://github.com/sponsors/ljharb" + } + }, + "node_modules/has-symbols": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/has-symbols/-/has-symbols-1.1.0.tgz", + "integrity": "sha512-1cDNdwJ2Jaohmb3sg4OmKaMBwuC48sYni5HUw2DvsC8LjGTLK9h+eb1X6RyuOHe4hT0ULCW68iomhjUoKUqlPQ==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/has-tostringtag": { + "version": "1.0.2", + "resolved": "https://registry.npmjs.org/has-tostringtag/-/has-tostringtag-1.0.2.tgz", + "integrity": "sha512-NqADB8VjPFLM2V0VvHUewwwsw0ZWBaIdgo+ieHtK3hasLz4qeCRjYcqfB6AQrBggRKppKF8L52/VqdVsO47Dlw==", + "dev": true, + "license": "MIT", + "dependencies": { + "has-symbols": "^1.0.3" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/hasown": { + "version": "2.0.2", + "resolved": "https://registry.npmjs.org/hasown/-/hasown-2.0.2.tgz", + "integrity": "sha512-0hJU9SCPvmMzIBdZFqNPXWa6dqh7WdH0cII9y+CyS8rG3nL48Bclra9HmKhVVUHyPWNH5Y7xDwAB7bfgSjkUMQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "function-bind": "^1.1.2" + }, + "engines": { + "node": ">= 0.4" + } + }, + "node_modules/hast-util-to-jsx-runtime": { + "version": "2.3.6", + "resolved": "https://registry.npmjs.org/hast-util-to-jsx-runtime/-/hast-util-to-jsx-runtime-2.3.6.tgz", + "integrity": "sha512-zl6s8LwNyo1P9uw+XJGvZtdFF1GdAkOg8ujOw+4Pyb76874fLps4ueHXDhXWdk6YHQ6OgUtinliG7RsYvCbbBg==", + "license": "MIT", + "dependencies": { + "@types/estree": "^1.0.0", + "@types/hast": "^3.0.0", + "@types/unist": "^3.0.0", + "comma-separated-tokens": "^2.0.0", + "devlop": "^1.0.0", + "estree-util-is-identifier-name": "^3.0.0", + "hast-util-whitespace": "^3.0.0", + "mdast-util-mdx-expression": "^2.0.0", + "mdast-util-mdx-jsx": "^3.0.0", + "mdast-util-mdxjs-esm": "^2.0.0", + "property-information": "^7.0.0", + "space-separated-tokens": "^2.0.0", + 
"style-to-js": "^1.0.0", + "unist-util-position": "^5.0.0", + "vfile-message": "^4.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/unified" + } + }, + "node_modules/hast-util-whitespace": { + "version": "3.0.0", + "resolved": "https://registry.npmjs.org/hast-util-whitespace/-/hast-util-whitespace-3.0.0.tgz", + "integrity": "sha512-88JUN06ipLwsnv+dVn+OIYOvAuvBMy/Qoi6O7mQHxdPXpjy+Cd6xRkWwux7DKO+4sYILtLBRIKgsdpS2gQc7qw==", + "license": "MIT", + "dependencies": { + "@types/hast": "^3.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/unified" + } + }, + "node_modules/html-url-attributes": { + "version": "3.0.1", + "resolved": "https://registry.npmjs.org/html-url-attributes/-/html-url-attributes-3.0.1.tgz", + "integrity": "sha512-ol6UPyBWqsrO6EJySPz2O7ZSr856WDrEzM5zMqp+FJJLGMW35cLYmmZnl0vztAZxRUoNZJFTCohfjuIJ8I4QBQ==", + "license": "MIT", + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/unified" + } + }, + "node_modules/ignore": { + "version": "5.3.2", + "resolved": "https://registry.npmjs.org/ignore/-/ignore-5.3.2.tgz", + "integrity": "sha512-hsBTNUqQTDwkWtcdYI2i06Y/nUBEsNEDJKjWdigLvegy8kDuJAS8uRlpkkcQpyEXL0Z/pjDy5HBmMjRCJ2gq+g==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">= 4" + } + }, + "node_modules/import-fresh": { + "version": "3.3.1", + "resolved": "https://registry.npmjs.org/import-fresh/-/import-fresh-3.3.1.tgz", + "integrity": "sha512-TR3KfrTZTYLPB6jUjfx6MF9WcWrHL9su5TObK4ZkYgBdWKPOFoSoQIdEuTuR82pmtxH2spWG9h6etwfr1pLBqQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "parent-module": "^1.0.0", + "resolve-from": "^4.0.0" + }, + "engines": { + "node": ">=6" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, + "node_modules/imurmurhash": { + "version": "0.1.4", + "resolved": "https://registry.npmjs.org/imurmurhash/-/imurmurhash-0.1.4.tgz", + "integrity": 
"sha512-JmXMZ6wuvDmLiHEml9ykzqO6lwFbof0GG4IkcGaENdCRDDmMVnny7s5HsIgHCbaq0w2MyPhDqkhTUgS2LU2PHA==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=0.8.19" + } + }, + "node_modules/inline-style-parser": { + "version": "0.2.7", + "resolved": "https://registry.npmjs.org/inline-style-parser/-/inline-style-parser-0.2.7.tgz", + "integrity": "sha512-Nb2ctOyNR8DqQoR0OwRG95uNWIC0C1lCgf5Naz5H6Ji72KZ8OcFZLz2P5sNgwlyoJ8Yif11oMuYs5pBQa86csA==", + "license": "MIT" + }, + "node_modules/internal-slot": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/internal-slot/-/internal-slot-1.1.0.tgz", + "integrity": "sha512-4gd7VpWNQNB4UKKCFFVcp1AVv+FMOgs9NKzjHKusc8jTMhd5eL1NqQqOpE0KzMds804/yHlglp3uxgluOqAPLw==", + "dev": true, + "license": "MIT", + "dependencies": { + "es-errors": "^1.3.0", + "hasown": "^2.0.2", + "side-channel": "^1.1.0" + }, + "engines": { + "node": ">= 0.4" + } + }, + "node_modules/is-alphabetical": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/is-alphabetical/-/is-alphabetical-2.0.1.tgz", + "integrity": "sha512-FWyyY60MeTNyeSRpkM2Iry0G9hpr7/9kD40mD/cGQEuilcZYS4okz8SN2Q6rLCJ8gbCt6fN+rC+6tMGS99LaxQ==", + "license": "MIT", + "funding": { + "type": "github", + "url": "https://github.com/sponsors/wooorm" + } + }, + "node_modules/is-alphanumerical": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/is-alphanumerical/-/is-alphanumerical-2.0.1.tgz", + "integrity": "sha512-hmbYhX/9MUMF5uh7tOXyK/n0ZvWpad5caBA17GsC6vyuCqaWliRG5K1qS9inmUhEMaOBIW7/whAnSwveW/LtZw==", + "license": "MIT", + "dependencies": { + "is-alphabetical": "^2.0.0", + "is-decimal": "^2.0.0" + }, + "funding": { + "type": "github", + "url": "https://github.com/sponsors/wooorm" + } + }, + "node_modules/is-array-buffer": { + "version": "3.0.5", + "resolved": "https://registry.npmjs.org/is-array-buffer/-/is-array-buffer-3.0.5.tgz", + "integrity": "sha512-DDfANUiiG2wC1qawP66qlTugJeL5HyzMpfr8lLK+jMQirGzNod0B12cFB/9q838Ru27sBwfw78/rdoU7RERz6A==", + 
"dev": true, + "license": "MIT", + "dependencies": { + "call-bind": "^1.0.8", + "call-bound": "^1.0.3", + "get-intrinsic": "^1.2.6" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/is-async-function": { + "version": "2.1.1", + "resolved": "https://registry.npmjs.org/is-async-function/-/is-async-function-2.1.1.tgz", + "integrity": "sha512-9dgM/cZBnNvjzaMYHVoxxfPj2QXt22Ev7SuuPrs+xav0ukGB0S6d4ydZdEiM48kLx5kDV+QBPrpVnFyefL8kkQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "async-function": "^1.0.0", + "call-bound": "^1.0.3", + "get-proto": "^1.0.1", + "has-tostringtag": "^1.0.2", + "safe-regex-test": "^1.1.0" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/is-bigint": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/is-bigint/-/is-bigint-1.1.0.tgz", + "integrity": "sha512-n4ZT37wG78iz03xPRKJrHTdZbe3IicyucEtdRsV5yglwc3GyUfbAfpSeD0FJ41NbUNSt5wbhqfp1fS+BgnvDFQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "has-bigints": "^1.0.2" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/is-boolean-object": { + "version": "1.2.2", + "resolved": "https://registry.npmjs.org/is-boolean-object/-/is-boolean-object-1.2.2.tgz", + "integrity": "sha512-wa56o2/ElJMYqjCjGkXri7it5FbebW5usLw/nPmCMs5DeZ7eziSYZhSmPRn0txqeW4LnAmQQU7FgqLpsEFKM4A==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bound": "^1.0.3", + "has-tostringtag": "^1.0.2" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/is-callable": { + "version": "1.2.7", + "resolved": "https://registry.npmjs.org/is-callable/-/is-callable-1.2.7.tgz", + "integrity": "sha512-1BC0BVFhS/p0qtw6enp8e+8OD0UrK0oFLztSjNzhcKA3WDuJxxAPXzPuPtKkjEY9UUoEWlX/8fgKeu2S8i9JTA==", 
+ "dev": true, + "license": "MIT", + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/is-core-module": { + "version": "2.16.1", + "resolved": "https://registry.npmjs.org/is-core-module/-/is-core-module-2.16.1.tgz", + "integrity": "sha512-UfoeMA6fIJ8wTYFEUjelnaGI67v6+N7qXJEvQuIGa99l4xsCruSYOVSQ0uPANn4dAzm8lkYPaKLrrijLq7x23w==", + "dev": true, + "license": "MIT", + "dependencies": { + "hasown": "^2.0.2" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/is-data-view": { + "version": "1.0.2", + "resolved": "https://registry.npmjs.org/is-data-view/-/is-data-view-1.0.2.tgz", + "integrity": "sha512-RKtWF8pGmS87i2D6gqQu/l7EYRlVdfzemCJN/P3UOs//x1QE7mfhvzHIApBTRf7axvT6DMGwSwBXYCT0nfB9xw==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bound": "^1.0.2", + "get-intrinsic": "^1.2.6", + "is-typed-array": "^1.1.13" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/is-date-object": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/is-date-object/-/is-date-object-1.1.0.tgz", + "integrity": "sha512-PwwhEakHVKTdRNVOw+/Gyh0+MzlCl4R6qKvkhuvLtPMggI1WAHt9sOwZxQLSGpUaDnrdyDsomoRgNnCfKNSXXg==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bound": "^1.0.2", + "has-tostringtag": "^1.0.2" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/is-decimal": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/is-decimal/-/is-decimal-2.0.1.tgz", + "integrity": "sha512-AAB9hiomQs5DXWcRB1rqsxGUstbRroFOPPVAomNk/3XHR5JyEZChOyTWe2oayKnsSsr/kcGqF+z6yuH6HHpN0A==", + "license": "MIT", + "funding": { + "type": "github", + "url": "https://github.com/sponsors/wooorm" + } + }, + "node_modules/is-extglob": { + "version": "2.1.1", + 
"resolved": "https://registry.npmjs.org/is-extglob/-/is-extglob-2.1.1.tgz", + "integrity": "sha512-SbKbANkN603Vi4jEZv49LeVJMn4yGwsbzZworEoyEiutsN3nJYdbO36zfhGJ6QEDpOZIFkDtnq5JRxmvl3jsoQ==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=0.10.0" + } + }, + "node_modules/is-finalizationregistry": { + "version": "1.1.1", + "resolved": "https://registry.npmjs.org/is-finalizationregistry/-/is-finalizationregistry-1.1.1.tgz", + "integrity": "sha512-1pC6N8qWJbWoPtEjgcL2xyhQOP491EQjeUo3qTKcmV8YSDDJrOepfG8pcC7h/QgnQHYSv0mJ3Z/ZWxmatVrysg==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bound": "^1.0.3" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/is-generator-function": { + "version": "1.1.2", + "resolved": "https://registry.npmjs.org/is-generator-function/-/is-generator-function-1.1.2.tgz", + "integrity": "sha512-upqt1SkGkODW9tsGNG5mtXTXtECizwtS2kA161M+gJPc1xdb/Ax629af6YrTwcOeQHbewrPNlE5Dx7kzvXTizA==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bound": "^1.0.4", + "generator-function": "^2.0.0", + "get-proto": "^1.0.1", + "has-tostringtag": "^1.0.2", + "safe-regex-test": "^1.1.0" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/is-glob": { + "version": "4.0.3", + "resolved": "https://registry.npmjs.org/is-glob/-/is-glob-4.0.3.tgz", + "integrity": "sha512-xelSayHH36ZgE7ZWhli7pW34hNbNl8Ojv5KVmkJD4hBdD3th8Tfk9vYasLM+mXWOZhFkgZfxhLSnrwRr4elSSg==", + "dev": true, + "license": "MIT", + "dependencies": { + "is-extglob": "^2.1.1" + }, + "engines": { + "node": ">=0.10.0" + } + }, + "node_modules/is-hexadecimal": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/is-hexadecimal/-/is-hexadecimal-2.0.1.tgz", + "integrity": "sha512-DgZQp241c8oO6cA1SbTEWiXeoxV42vlcJxgH+B3hi1AiqqKruZR3ZGF8In3fj4+/y/7rHvlOZLZtgJ/4ttYGZg==", + "license": "MIT", + "funding": { 
+ "type": "github", + "url": "https://github.com/sponsors/wooorm" + } + }, + "node_modules/is-map": { + "version": "2.0.3", + "resolved": "https://registry.npmjs.org/is-map/-/is-map-2.0.3.tgz", + "integrity": "sha512-1Qed0/Hr2m+YqxnM09CjA2d/i6YZNfF6R2oRAOj36eUdS6qIV/huPJNSEpKbupewFs+ZsJlxsjjPbc0/afW6Lw==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/is-negative-zero": { + "version": "2.0.3", + "resolved": "https://registry.npmjs.org/is-negative-zero/-/is-negative-zero-2.0.3.tgz", + "integrity": "sha512-5KoIu2Ngpyek75jXodFvnafB6DJgr3u8uuK0LEZJjrU19DrMD3EVERaR8sjz8CCGgpZvxPl9SuE1GMVPFHx1mw==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/is-number-object": { + "version": "1.1.1", + "resolved": "https://registry.npmjs.org/is-number-object/-/is-number-object-1.1.1.tgz", + "integrity": "sha512-lZhclumE1G6VYD8VHe35wFaIif+CTy5SJIi5+3y4psDgWu4wPDoBhF8NxUOinEc7pHgiTsT6MaBb92rKhhD+Xw==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bound": "^1.0.3", + "has-tostringtag": "^1.0.2" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/is-plain-obj": { + "version": "4.1.0", + "resolved": "https://registry.npmjs.org/is-plain-obj/-/is-plain-obj-4.1.0.tgz", + "integrity": "sha512-+Pgi+vMuUNkJyExiMBt5IlFoMyKnr5zhJ4Uspz58WOhBF5QoIZkFyNHIbBAtHwzVAgk5RtndVNsDRN61/mmDqg==", + "license": "MIT", + "engines": { + "node": ">=12" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, + "node_modules/is-regex": { + "version": "1.2.1", + "resolved": "https://registry.npmjs.org/is-regex/-/is-regex-1.2.1.tgz", + "integrity": "sha512-MjYsKHO5O7mCsmRGxWcLWheFqN9DJ/2TmngvjKXihe6efViPqc274+Fx/4fYj/r03+ESvBdTXK0V6tA3rgez1g==", + "dev": true, + "license": 
"MIT", + "dependencies": { + "call-bound": "^1.0.2", + "gopd": "^1.2.0", + "has-tostringtag": "^1.0.2", + "hasown": "^2.0.2" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/is-set": { + "version": "2.0.3", + "resolved": "https://registry.npmjs.org/is-set/-/is-set-2.0.3.tgz", + "integrity": "sha512-iPAjerrse27/ygGLxw+EBR9agv9Y6uLeYVJMu+QNCoouJ1/1ri0mGrcWpfCqFZuzzx3WjtwxG098X+n4OuRkPg==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/is-shared-array-buffer": { + "version": "1.0.4", + "resolved": "https://registry.npmjs.org/is-shared-array-buffer/-/is-shared-array-buffer-1.0.4.tgz", + "integrity": "sha512-ISWac8drv4ZGfwKl5slpHG9OwPNty4jOWPRIhBpxOoD+hqITiwuipOQ2bNthAzwA3B4fIjO4Nln74N0S9byq8A==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bound": "^1.0.3" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/is-string": { + "version": "1.1.1", + "resolved": "https://registry.npmjs.org/is-string/-/is-string-1.1.1.tgz", + "integrity": "sha512-BtEeSsoaQjlSPBemMQIrY1MY0uM6vnS1g5fmufYOtnxLGUZM2178PKbhsk7Ffv58IX+ZtcvoGwccYsh0PglkAA==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bound": "^1.0.3", + "has-tostringtag": "^1.0.2" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/is-symbol": { + "version": "1.1.1", + "resolved": "https://registry.npmjs.org/is-symbol/-/is-symbol-1.1.1.tgz", + "integrity": "sha512-9gGx6GTtCQM73BgmHQXfDmLtfjjTUDSyoxTCbp5WtoixAhfgsDirWIcVQ/IHpvI5Vgd5i/J5F7B9cN/WlVbC/w==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bound": "^1.0.2", + "has-symbols": "^1.1.0", + "safe-regex-test": "^1.1.0" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + 
"url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/is-typed-array": { + "version": "1.1.15", + "resolved": "https://registry.npmjs.org/is-typed-array/-/is-typed-array-1.1.15.tgz", + "integrity": "sha512-p3EcsicXjit7SaskXHs1hA91QxgTw46Fv6EFKKGS5DRFLD8yKnohjF3hxoju94b/OcMZoQukzpPpBE9uLVKzgQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "which-typed-array": "^1.1.16" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/is-weakmap": { + "version": "2.0.2", + "resolved": "https://registry.npmjs.org/is-weakmap/-/is-weakmap-2.0.2.tgz", + "integrity": "sha512-K5pXYOm9wqY1RgjpL3YTkF39tni1XajUIkawTLUo9EZEVUFga5gSQJF8nNS7ZwJQ02y+1YCNYcMh+HIf1ZqE+w==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/is-weakref": { + "version": "1.1.1", + "resolved": "https://registry.npmjs.org/is-weakref/-/is-weakref-1.1.1.tgz", + "integrity": "sha512-6i9mGWSlqzNMEqpCp93KwRS1uUOodk2OJ6b+sq7ZPDSy2WuI5NFIxp/254TytR8ftefexkWn5xNiHUNpPOfSew==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bound": "^1.0.3" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/is-weakset": { + "version": "2.0.4", + "resolved": "https://registry.npmjs.org/is-weakset/-/is-weakset-2.0.4.tgz", + "integrity": "sha512-mfcwb6IzQyOKTs84CQMrOwW4gQcaTOAWJ0zzJCl2WSPDrWk/OzDaImWFH3djXhb24g4eudZfLRozAvPGw4d9hQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bound": "^1.0.3", + "get-intrinsic": "^1.2.6" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/isarray": { + "version": "2.0.5", + "resolved": "https://registry.npmjs.org/isarray/-/isarray-2.0.5.tgz", + "integrity": 
"sha512-xHjhDr3cNBK0BzdUJSPXZntQUx/mwMS5Rw4A7lPJ90XGAO6ISP/ePDNuo0vhqOZU+UD5JoodwCAAoZQd3FeAKw==", + "dev": true, + "license": "MIT" + }, + "node_modules/isexe": { + "version": "2.0.0", + "resolved": "https://registry.npmjs.org/isexe/-/isexe-2.0.0.tgz", + "integrity": "sha512-RHxMLp9lnKHGHRng9QFhRCMbYAcVpn69smSGcq3f36xjgVVWThj4qqLbTLlq7Ssj8B+fIQ1EuCEGI2lKsyQeIw==", + "dev": true, + "license": "ISC" + }, + "node_modules/iterator.prototype": { + "version": "1.1.5", + "resolved": "https://registry.npmjs.org/iterator.prototype/-/iterator.prototype-1.1.5.tgz", + "integrity": "sha512-H0dkQoCa3b2VEeKQBOxFph+JAbcrQdE7KC0UkqwpLmv2EC4P41QXP+rqo9wYodACiG5/WM5s9oDApTU8utwj9g==", + "dev": true, + "license": "MIT", + "dependencies": { + "define-data-property": "^1.1.4", + "es-object-atoms": "^1.0.0", + "get-intrinsic": "^1.2.6", + "get-proto": "^1.0.0", + "has-symbols": "^1.1.0", + "set-function-name": "^2.0.2" + }, + "engines": { + "node": ">= 0.4" + } + }, + "node_modules/jiti": { + "version": "2.6.1", + "resolved": "https://registry.npmjs.org/jiti/-/jiti-2.6.1.tgz", + "integrity": "sha512-ekilCSN1jwRvIbgeg/57YFh8qQDNbwDb9xT/qu2DAHbFFZUicIl4ygVaAvzveMhMVr3LnpSKTNnwt8PoOfmKhQ==", + "dev": true, + "license": "MIT", + "bin": { + "jiti": "lib/jiti-cli.mjs" + } + }, + "node_modules/js-tokens": { + "version": "4.0.0", + "resolved": "https://registry.npmjs.org/js-tokens/-/js-tokens-4.0.0.tgz", + "integrity": "sha512-RdJUflcE3cUzKiMqQgsCu06FPu9UdIJO0beYbPhHN4k6apgJtifcoCtT9bcxOpYBtpD2kCM6Sbzg4CausW/PKQ==", + "license": "MIT" + }, + "node_modules/js-yaml": { + "version": "4.1.1", + "resolved": "https://registry.npmjs.org/js-yaml/-/js-yaml-4.1.1.tgz", + "integrity": "sha512-qQKT4zQxXl8lLwBtHMWwaTcGfFOZviOJet3Oy/xmGk2gZH677CJM9EvtfdSkgWcATZhj/55JZ0rmy3myCT5lsA==", + "dev": true, + "license": "MIT", + "dependencies": { + "argparse": "^2.0.1" + }, + "bin": { + "js-yaml": "bin/js-yaml.js" + } + }, + "node_modules/jsesc": { + "version": "3.1.0", + "resolved": 
"https://registry.npmjs.org/jsesc/-/jsesc-3.1.0.tgz", + "integrity": "sha512-/sM3dO2FOzXjKQhJuo0Q173wf2KOo8t4I8vHy6lF9poUp7bKT0/NHE8fPX23PwfhnykfqnC2xRxOnVw5XuGIaA==", + "dev": true, + "license": "MIT", + "bin": { + "jsesc": "bin/jsesc" + }, + "engines": { + "node": ">=6" + } + }, + "node_modules/json-buffer": { + "version": "3.0.1", + "resolved": "https://registry.npmjs.org/json-buffer/-/json-buffer-3.0.1.tgz", + "integrity": "sha512-4bV5BfR2mqfQTJm+V5tPPdf+ZpuhiIvTuAB5g8kcrXOZpTT/QwwVRWBywX1ozr6lEuPdbHxwaJlm9G6mI2sfSQ==", + "dev": true, + "license": "MIT" + }, + "node_modules/json-schema-traverse": { + "version": "0.4.1", + "resolved": "https://registry.npmjs.org/json-schema-traverse/-/json-schema-traverse-0.4.1.tgz", + "integrity": "sha512-xbbCH5dCYU5T8LcEhhuh7HJ88HXuW3qsI3Y0zOZFKfZEHcpWiHU/Jxzk629Brsab/mMiHQti9wMP+845RPe3Vg==", + "dev": true, + "license": "MIT" + }, + "node_modules/json-stable-stringify-without-jsonify": { + "version": "1.0.1", + "resolved": "https://registry.npmjs.org/json-stable-stringify-without-jsonify/-/json-stable-stringify-without-jsonify-1.0.1.tgz", + "integrity": "sha512-Bdboy+l7tA3OGW6FjyFHWkP5LuByj1Tk33Ljyq0axyzdk9//JSi2u3fP1QSmd1KNwq6VOKYGlAu87CisVir6Pw==", + "dev": true, + "license": "MIT" + }, + "node_modules/json5": { + "version": "2.2.3", + "resolved": "https://registry.npmjs.org/json5/-/json5-2.2.3.tgz", + "integrity": "sha512-XmOWe7eyHYH14cLdVPoyg+GOH3rYX++KpzrylJwSW98t3Nk+U8XOl8FWKOgwtzdb8lXGf6zYwDUzeHMWfxasyg==", + "dev": true, + "license": "MIT", + "bin": { + "json5": "lib/cli.js" + }, + "engines": { + "node": ">=6" + } + }, + "node_modules/jsx-ast-utils": { + "version": "3.3.5", + "resolved": "https://registry.npmjs.org/jsx-ast-utils/-/jsx-ast-utils-3.3.5.tgz", + "integrity": "sha512-ZZow9HBI5O6EPgSJLUb8n2NKgmVWTwCvHGwFuJlMjvLFqlGG6pjirPhtdsseaLZjSibD8eegzmYpUZwoIlj2cQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "array-includes": "^3.1.6", + "array.prototype.flat": "^1.3.1", + "object.assign": "^4.1.4", + 
"object.values": "^1.1.6" + }, + "engines": { + "node": ">=4.0" + } + }, + "node_modules/keyv": { + "version": "4.5.4", + "resolved": "https://registry.npmjs.org/keyv/-/keyv-4.5.4.tgz", + "integrity": "sha512-oxVHkHR/EJf2CNXnWxRLW6mg7JyCCUcG0DtEGmL2ctUo1PNTin1PUil+r/+4r5MpVgC/fn1kjsx7mjSujKqIpw==", + "dev": true, + "license": "MIT", + "dependencies": { + "json-buffer": "3.0.1" + } + }, + "node_modules/levn": { + "version": "0.4.1", + "resolved": "https://registry.npmjs.org/levn/-/levn-0.4.1.tgz", + "integrity": "sha512-+bT2uH4E5LGE7h/n3evcS/sQlJXCpIp6ym8OWJ5eV6+67Dsql/LaaT7qJBAt2rzfoa/5QBGBhxDix1dMt2kQKQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "prelude-ls": "^1.2.1", + "type-check": "~0.4.0" + }, + "engines": { + "node": ">= 0.8.0" + } + }, + "node_modules/lightningcss": { + "version": "1.30.2", + "resolved": "https://registry.npmjs.org/lightningcss/-/lightningcss-1.30.2.tgz", + "integrity": "sha512-utfs7Pr5uJyyvDETitgsaqSyjCb2qNRAtuqUeWIAKztsOYdcACf2KtARYXg2pSvhkt+9NfoaNY7fxjl6nuMjIQ==", + "dev": true, + "license": "MPL-2.0", + "dependencies": { + "detect-libc": "^2.0.3" + }, + "engines": { + "node": ">= 12.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/parcel" + }, + "optionalDependencies": { + "lightningcss-android-arm64": "1.30.2", + "lightningcss-darwin-arm64": "1.30.2", + "lightningcss-darwin-x64": "1.30.2", + "lightningcss-freebsd-x64": "1.30.2", + "lightningcss-linux-arm-gnueabihf": "1.30.2", + "lightningcss-linux-arm64-gnu": "1.30.2", + "lightningcss-linux-arm64-musl": "1.30.2", + "lightningcss-linux-x64-gnu": "1.30.2", + "lightningcss-linux-x64-musl": "1.30.2", + "lightningcss-win32-arm64-msvc": "1.30.2", + "lightningcss-win32-x64-msvc": "1.30.2" + } + }, + "node_modules/lightningcss-android-arm64": { + "version": "1.30.2", + "resolved": "https://registry.npmjs.org/lightningcss-android-arm64/-/lightningcss-android-arm64-1.30.2.tgz", + "integrity": 
"sha512-BH9sEdOCahSgmkVhBLeU7Hc9DWeZ1Eb6wNS6Da8igvUwAe0sqROHddIlvU06q3WyXVEOYDZ6ykBZQnjTbmo4+A==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MPL-2.0", + "optional": true, + "os": [ + "android" + ], + "engines": { + "node": ">= 12.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/parcel" + } + }, + "node_modules/lightningcss-darwin-arm64": { + "version": "1.30.2", + "resolved": "https://registry.npmjs.org/lightningcss-darwin-arm64/-/lightningcss-darwin-arm64-1.30.2.tgz", + "integrity": "sha512-ylTcDJBN3Hp21TdhRT5zBOIi73P6/W0qwvlFEk22fkdXchtNTOU4Qc37SkzV+EKYxLouZ6M4LG9NfZ1qkhhBWA==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MPL-2.0", + "optional": true, + "os": [ + "darwin" + ], + "engines": { + "node": ">= 12.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/parcel" + } + }, + "node_modules/lightningcss-darwin-x64": { + "version": "1.30.2", + "resolved": "https://registry.npmjs.org/lightningcss-darwin-x64/-/lightningcss-darwin-x64-1.30.2.tgz", + "integrity": "sha512-oBZgKchomuDYxr7ilwLcyms6BCyLn0z8J0+ZZmfpjwg9fRVZIR5/GMXd7r9RH94iDhld3UmSjBM6nXWM2TfZTQ==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MPL-2.0", + "optional": true, + "os": [ + "darwin" + ], + "engines": { + "node": ">= 12.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/parcel" + } + }, + "node_modules/lightningcss-freebsd-x64": { + "version": "1.30.2", + "resolved": "https://registry.npmjs.org/lightningcss-freebsd-x64/-/lightningcss-freebsd-x64-1.30.2.tgz", + "integrity": "sha512-c2bH6xTrf4BDpK8MoGG4Bd6zAMZDAXS569UxCAGcA7IKbHNMlhGQ89eRmvpIUGfKWNVdbhSbkQaWhEoMGmGslA==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MPL-2.0", + "optional": true, + "os": [ + "freebsd" + ], + "engines": { + "node": ">= 12.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/parcel" + } + }, + 
"node_modules/lightningcss-linux-arm-gnueabihf": { + "version": "1.30.2", + "resolved": "https://registry.npmjs.org/lightningcss-linux-arm-gnueabihf/-/lightningcss-linux-arm-gnueabihf-1.30.2.tgz", + "integrity": "sha512-eVdpxh4wYcm0PofJIZVuYuLiqBIakQ9uFZmipf6LF/HRj5Bgm0eb3qL/mr1smyXIS1twwOxNWndd8z0E374hiA==", + "cpu": [ + "arm" + ], + "dev": true, + "license": "MPL-2.0", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">= 12.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/parcel" + } + }, + "node_modules/lightningcss-linux-arm64-gnu": { + "version": "1.30.2", + "resolved": "https://registry.npmjs.org/lightningcss-linux-arm64-gnu/-/lightningcss-linux-arm64-gnu-1.30.2.tgz", + "integrity": "sha512-UK65WJAbwIJbiBFXpxrbTNArtfuznvxAJw4Q2ZGlU8kPeDIWEX1dg3rn2veBVUylA2Ezg89ktszWbaQnxD/e3A==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MPL-2.0", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">= 12.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/parcel" + } + }, + "node_modules/lightningcss-linux-arm64-musl": { + "version": "1.30.2", + "resolved": "https://registry.npmjs.org/lightningcss-linux-arm64-musl/-/lightningcss-linux-arm64-musl-1.30.2.tgz", + "integrity": "sha512-5Vh9dGeblpTxWHpOx8iauV02popZDsCYMPIgiuw97OJ5uaDsL86cnqSFs5LZkG3ghHoX5isLgWzMs+eD1YzrnA==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MPL-2.0", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">= 12.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/parcel" + } + }, + "node_modules/lightningcss-linux-x64-gnu": { + "version": "1.30.2", + "resolved": "https://registry.npmjs.org/lightningcss-linux-x64-gnu/-/lightningcss-linux-x64-gnu-1.30.2.tgz", + "integrity": "sha512-Cfd46gdmj1vQ+lR6VRTTadNHu6ALuw2pKR9lYq4FnhvgBc4zWY1EtZcAc6EffShbb1MFrIPfLDXD6Xprbnni4w==", + "cpu": [ + "x64" + ], + "dev": true, + 
"license": "MPL-2.0", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">= 12.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/parcel" + } + }, + "node_modules/lightningcss-linux-x64-musl": { + "version": "1.30.2", + "resolved": "https://registry.npmjs.org/lightningcss-linux-x64-musl/-/lightningcss-linux-x64-musl-1.30.2.tgz", + "integrity": "sha512-XJaLUUFXb6/QG2lGIW6aIk6jKdtjtcffUT0NKvIqhSBY3hh9Ch+1LCeH80dR9q9LBjG3ewbDjnumefsLsP6aiA==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MPL-2.0", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">= 12.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/parcel" + } + }, + "node_modules/lightningcss-win32-arm64-msvc": { + "version": "1.30.2", + "resolved": "https://registry.npmjs.org/lightningcss-win32-arm64-msvc/-/lightningcss-win32-arm64-msvc-1.30.2.tgz", + "integrity": "sha512-FZn+vaj7zLv//D/192WFFVA0RgHawIcHqLX9xuWiQt7P0PtdFEVaxgF9rjM/IRYHQXNnk61/H/gb2Ei+kUQ4xQ==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MPL-2.0", + "optional": true, + "os": [ + "win32" + ], + "engines": { + "node": ">= 12.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/parcel" + } + }, + "node_modules/lightningcss-win32-x64-msvc": { + "version": "1.30.2", + "resolved": "https://registry.npmjs.org/lightningcss-win32-x64-msvc/-/lightningcss-win32-x64-msvc-1.30.2.tgz", + "integrity": "sha512-5g1yc73p+iAkid5phb4oVFMB45417DkRevRbt/El/gKXJk4jid+vPFF/AXbxn05Aky8PapwzZrdJShv5C0avjw==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MPL-2.0", + "optional": true, + "os": [ + "win32" + ], + "engines": { + "node": ">= 12.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/parcel" + } + }, + "node_modules/locate-path": { + "version": "6.0.0", + "resolved": "https://registry.npmjs.org/locate-path/-/locate-path-6.0.0.tgz", + "integrity": 
"sha512-iPZK6eYjbxRu3uB4/WZ3EsEIMJFMqAoopl3R+zuq0UjcAm/MO6KCweDgPfP3elTztoKP3KtnVHxTn2NHBSDVUw==", + "dev": true, + "license": "MIT", + "dependencies": { + "p-locate": "^5.0.0" + }, + "engines": { + "node": ">=10" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, + "node_modules/lodash.merge": { + "version": "4.6.2", + "resolved": "https://registry.npmjs.org/lodash.merge/-/lodash.merge-4.6.2.tgz", + "integrity": "sha512-0KpjqXRVvrYyCsX1swR/XTK0va6VQkQM6MNo7PqW77ByjAhoARA8EfrP1N4+KlKj8YS0ZUCtRT/YUuhyYDujIQ==", + "dev": true, + "license": "MIT" + }, + "node_modules/longest-streak": { + "version": "3.1.0", + "resolved": "https://registry.npmjs.org/longest-streak/-/longest-streak-3.1.0.tgz", + "integrity": "sha512-9Ri+o0JYgehTaVBBDoMqIl8GXtbWg711O3srftcHhZ0dqnETqLaoIK0x17fUw9rFSlK/0NlsKe0Ahhyl5pXE2g==", + "license": "MIT", + "funding": { + "type": "github", + "url": "https://github.com/sponsors/wooorm" + } + }, + "node_modules/loose-envify": { + "version": "1.4.0", + "resolved": "https://registry.npmjs.org/loose-envify/-/loose-envify-1.4.0.tgz", + "integrity": "sha512-lyuxPGr/Wfhrlem2CL/UcnUc1zcqKAImBDzukY7Y5F/yQiNdko6+fRLevlw1HgMySw7f611UIY408EtxRSoK3Q==", + "license": "MIT", + "dependencies": { + "js-tokens": "^3.0.0 || ^4.0.0" + }, + "bin": { + "loose-envify": "cli.js" + } + }, + "node_modules/lru-cache": { + "version": "5.1.1", + "resolved": "https://registry.npmjs.org/lru-cache/-/lru-cache-5.1.1.tgz", + "integrity": "sha512-KpNARQA3Iwv+jTA0utUVVbrh+Jlrr1Fv0e56GGzAFOXN7dk/FviaDW8LHmK52DlcH4WP2n6gI8vN1aesBFgo9w==", + "dev": true, + "license": "ISC", + "dependencies": { + "yallist": "^3.0.2" + } + }, + "node_modules/magic-string": { + "version": "0.30.21", + "resolved": "https://registry.npmjs.org/magic-string/-/magic-string-0.30.21.tgz", + "integrity": "sha512-vd2F4YUyEXKGcLHoq+TEyCjxueSeHnFxyyjNp80yg0XV4vUhnDer/lvvlqM/arB5bXQN5K2/3oinyCRyx8T2CQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "@jridgewell/sourcemap-codec": 
"^1.5.5" + } + }, + "node_modules/math-intrinsics": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/math-intrinsics/-/math-intrinsics-1.1.0.tgz", + "integrity": "sha512-/IXtbwEk5HTPyEwyKX6hGkYXxM9nbj64B+ilVJnC/R6B0pH5G4V3b0pVbL7DBj4tkhBAppbQUlf6F6Xl9LHu1g==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">= 0.4" + } + }, + "node_modules/mdast-util-from-markdown": { + "version": "2.0.2", + "resolved": "https://registry.npmjs.org/mdast-util-from-markdown/-/mdast-util-from-markdown-2.0.2.tgz", + "integrity": "sha512-uZhTV/8NBuw0WHkPTrCqDOl0zVe1BIng5ZtHoDk49ME1qqcjYmmLmOf0gELgcRMxN4w2iuIeVso5/6QymSrgmA==", + "license": "MIT", + "dependencies": { + "@types/mdast": "^4.0.0", + "@types/unist": "^3.0.0", + "decode-named-character-reference": "^1.0.0", + "devlop": "^1.0.0", + "mdast-util-to-string": "^4.0.0", + "micromark": "^4.0.0", + "micromark-util-decode-numeric-character-reference": "^2.0.0", + "micromark-util-decode-string": "^2.0.0", + "micromark-util-normalize-identifier": "^2.0.0", + "micromark-util-symbol": "^2.0.0", + "micromark-util-types": "^2.0.0", + "unist-util-stringify-position": "^4.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/unified" + } + }, + "node_modules/mdast-util-mdx-expression": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/mdast-util-mdx-expression/-/mdast-util-mdx-expression-2.0.1.tgz", + "integrity": "sha512-J6f+9hUp+ldTZqKRSg7Vw5V6MqjATc+3E4gf3CFNcuZNWD8XdyI6zQ8GqH7f8169MM6P7hMBRDVGnn7oHB9kXQ==", + "license": "MIT", + "dependencies": { + "@types/estree-jsx": "^1.0.0", + "@types/hast": "^3.0.0", + "@types/mdast": "^4.0.0", + "devlop": "^1.0.0", + "mdast-util-from-markdown": "^2.0.0", + "mdast-util-to-markdown": "^2.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/unified" + } + }, + "node_modules/mdast-util-mdx-jsx": { + "version": "3.2.0", + "resolved": 
"https://registry.npmjs.org/mdast-util-mdx-jsx/-/mdast-util-mdx-jsx-3.2.0.tgz", + "integrity": "sha512-lj/z8v0r6ZtsN/cGNNtemmmfoLAFZnjMbNyLzBafjzikOM+glrjNHPlf6lQDOTccj9n5b0PPihEBbhneMyGs1Q==", + "license": "MIT", + "dependencies": { + "@types/estree-jsx": "^1.0.0", + "@types/hast": "^3.0.0", + "@types/mdast": "^4.0.0", + "@types/unist": "^3.0.0", + "ccount": "^2.0.0", + "devlop": "^1.1.0", + "mdast-util-from-markdown": "^2.0.0", + "mdast-util-to-markdown": "^2.0.0", + "parse-entities": "^4.0.0", + "stringify-entities": "^4.0.0", + "unist-util-stringify-position": "^4.0.0", + "vfile-message": "^4.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/unified" + } + }, + "node_modules/mdast-util-mdxjs-esm": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/mdast-util-mdxjs-esm/-/mdast-util-mdxjs-esm-2.0.1.tgz", + "integrity": "sha512-EcmOpxsZ96CvlP03NghtH1EsLtr0n9Tm4lPUJUBccV9RwUOneqSycg19n5HGzCf+10LozMRSObtVr3ee1WoHtg==", + "license": "MIT", + "dependencies": { + "@types/estree-jsx": "^1.0.0", + "@types/hast": "^3.0.0", + "@types/mdast": "^4.0.0", + "devlop": "^1.0.0", + "mdast-util-from-markdown": "^2.0.0", + "mdast-util-to-markdown": "^2.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/unified" + } + }, + "node_modules/mdast-util-phrasing": { + "version": "4.1.0", + "resolved": "https://registry.npmjs.org/mdast-util-phrasing/-/mdast-util-phrasing-4.1.0.tgz", + "integrity": "sha512-TqICwyvJJpBwvGAMZjj4J2n0X8QWp21b9l0o7eXyVJ25YNWYbJDVIyD1bZXE6WtV6RmKJVYmQAKWa0zWOABz2w==", + "license": "MIT", + "dependencies": { + "@types/mdast": "^4.0.0", + "unist-util-is": "^6.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/unified" + } + }, + "node_modules/mdast-util-to-hast": { + "version": "13.2.1", + "resolved": "https://registry.npmjs.org/mdast-util-to-hast/-/mdast-util-to-hast-13.2.1.tgz", + "integrity": 
"sha512-cctsq2wp5vTsLIcaymblUriiTcZd0CwWtCbLvrOzYCDZoWyMNV8sZ7krj09FSnsiJi3WVsHLM4k6Dq/yaPyCXA==", + "license": "MIT", + "dependencies": { + "@types/hast": "^3.0.0", + "@types/mdast": "^4.0.0", + "@ungap/structured-clone": "^1.0.0", + "devlop": "^1.0.0", + "micromark-util-sanitize-uri": "^2.0.0", + "trim-lines": "^3.0.0", + "unist-util-position": "^5.0.0", + "unist-util-visit": "^5.0.0", + "vfile": "^6.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/unified" + } + }, + "node_modules/mdast-util-to-markdown": { + "version": "2.1.2", + "resolved": "https://registry.npmjs.org/mdast-util-to-markdown/-/mdast-util-to-markdown-2.1.2.tgz", + "integrity": "sha512-xj68wMTvGXVOKonmog6LwyJKrYXZPvlwabaryTjLh9LuvovB/KAH+kvi8Gjj+7rJjsFi23nkUxRQv1KqSroMqA==", + "license": "MIT", + "dependencies": { + "@types/mdast": "^4.0.0", + "@types/unist": "^3.0.0", + "longest-streak": "^3.0.0", + "mdast-util-phrasing": "^4.0.0", + "mdast-util-to-string": "^4.0.0", + "micromark-util-classify-character": "^2.0.0", + "micromark-util-decode-string": "^2.0.0", + "unist-util-visit": "^5.0.0", + "zwitch": "^2.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/unified" + } + }, + "node_modules/mdast-util-to-string": { + "version": "4.0.0", + "resolved": "https://registry.npmjs.org/mdast-util-to-string/-/mdast-util-to-string-4.0.0.tgz", + "integrity": "sha512-0H44vDimn51F0YwvxSJSm0eCDOJTRlmN0R1yBh4HLj9wiV1Dn0QoXGbvFAWj2hSItVTlCmBF1hqKlIyUBVFLPg==", + "license": "MIT", + "dependencies": { + "@types/mdast": "^4.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/unified" + } + }, + "node_modules/micromark": { + "version": "4.0.2", + "resolved": "https://registry.npmjs.org/micromark/-/micromark-4.0.2.tgz", + "integrity": "sha512-zpe98Q6kvavpCr1NPVSCMebCKfD7CA2NqZ+rykeNhONIJBpc1tFKt9hucLGwha3jNTNI8lHpctWJWoimVF4PfA==", + "funding": [ + { + "type": "GitHub Sponsors", + "url": 
"https://github.com/sponsors/unifiedjs" + }, + { + "type": "OpenCollective", + "url": "https://opencollective.com/unified" + } + ], + "license": "MIT", + "dependencies": { + "@types/debug": "^4.0.0", + "debug": "^4.0.0", + "decode-named-character-reference": "^1.0.0", + "devlop": "^1.0.0", + "micromark-core-commonmark": "^2.0.0", + "micromark-factory-space": "^2.0.0", + "micromark-util-character": "^2.0.0", + "micromark-util-chunked": "^2.0.0", + "micromark-util-combine-extensions": "^2.0.0", + "micromark-util-decode-numeric-character-reference": "^2.0.0", + "micromark-util-encode": "^2.0.0", + "micromark-util-normalize-identifier": "^2.0.0", + "micromark-util-resolve-all": "^2.0.0", + "micromark-util-sanitize-uri": "^2.0.0", + "micromark-util-subtokenize": "^2.0.0", + "micromark-util-symbol": "^2.0.0", + "micromark-util-types": "^2.0.0" + } + }, + "node_modules/micromark-core-commonmark": { + "version": "2.0.3", + "resolved": "https://registry.npmjs.org/micromark-core-commonmark/-/micromark-core-commonmark-2.0.3.tgz", + "integrity": "sha512-RDBrHEMSxVFLg6xvnXmb1Ayr2WzLAWjeSATAoxwKYJV94TeNavgoIdA0a9ytzDSVzBy2YKFK+emCPOEibLeCrg==", + "funding": [ + { + "type": "GitHub Sponsors", + "url": "https://github.com/sponsors/unifiedjs" + }, + { + "type": "OpenCollective", + "url": "https://opencollective.com/unified" + } + ], + "license": "MIT", + "dependencies": { + "decode-named-character-reference": "^1.0.0", + "devlop": "^1.0.0", + "micromark-factory-destination": "^2.0.0", + "micromark-factory-label": "^2.0.0", + "micromark-factory-space": "^2.0.0", + "micromark-factory-title": "^2.0.0", + "micromark-factory-whitespace": "^2.0.0", + "micromark-util-character": "^2.0.0", + "micromark-util-chunked": "^2.0.0", + "micromark-util-classify-character": "^2.0.0", + "micromark-util-html-tag-name": "^2.0.0", + "micromark-util-normalize-identifier": "^2.0.0", + "micromark-util-resolve-all": "^2.0.0", + "micromark-util-subtokenize": "^2.0.0", + "micromark-util-symbol": "^2.0.0", + 
"micromark-util-types": "^2.0.0" + } + }, + "node_modules/micromark-factory-destination": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/micromark-factory-destination/-/micromark-factory-destination-2.0.1.tgz", + "integrity": "sha512-Xe6rDdJlkmbFRExpTOmRj9N3MaWmbAgdpSrBQvCFqhezUn4AHqJHbaEnfbVYYiexVSs//tqOdY/DxhjdCiJnIA==", + "funding": [ + { + "type": "GitHub Sponsors", + "url": "https://github.com/sponsors/unifiedjs" + }, + { + "type": "OpenCollective", + "url": "https://opencollective.com/unified" + } + ], + "license": "MIT", + "dependencies": { + "micromark-util-character": "^2.0.0", + "micromark-util-symbol": "^2.0.0", + "micromark-util-types": "^2.0.0" + } + }, + "node_modules/micromark-factory-label": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/micromark-factory-label/-/micromark-factory-label-2.0.1.tgz", + "integrity": "sha512-VFMekyQExqIW7xIChcXn4ok29YE3rnuyveW3wZQWWqF4Nv9Wk5rgJ99KzPvHjkmPXF93FXIbBp6YdW3t71/7Vg==", + "funding": [ + { + "type": "GitHub Sponsors", + "url": "https://github.com/sponsors/unifiedjs" + }, + { + "type": "OpenCollective", + "url": "https://opencollective.com/unified" + } + ], + "license": "MIT", + "dependencies": { + "devlop": "^1.0.0", + "micromark-util-character": "^2.0.0", + "micromark-util-symbol": "^2.0.0", + "micromark-util-types": "^2.0.0" + } + }, + "node_modules/micromark-factory-space": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/micromark-factory-space/-/micromark-factory-space-2.0.1.tgz", + "integrity": "sha512-zRkxjtBxxLd2Sc0d+fbnEunsTj46SWXgXciZmHq0kDYGnck/ZSGj9/wULTV95uoeYiK5hRXP2mJ98Uo4cq/LQg==", + "funding": [ + { + "type": "GitHub Sponsors", + "url": "https://github.com/sponsors/unifiedjs" + }, + { + "type": "OpenCollective", + "url": "https://opencollective.com/unified" + } + ], + "license": "MIT", + "dependencies": { + "micromark-util-character": "^2.0.0", + "micromark-util-types": "^2.0.0" + } + }, + "node_modules/micromark-factory-title": { + 
"version": "2.0.1", + "resolved": "https://registry.npmjs.org/micromark-factory-title/-/micromark-factory-title-2.0.1.tgz", + "integrity": "sha512-5bZ+3CjhAd9eChYTHsjy6TGxpOFSKgKKJPJxr293jTbfry2KDoWkhBb6TcPVB4NmzaPhMs1Frm9AZH7OD4Cjzw==", + "funding": [ + { + "type": "GitHub Sponsors", + "url": "https://github.com/sponsors/unifiedjs" + }, + { + "type": "OpenCollective", + "url": "https://opencollective.com/unified" + } + ], + "license": "MIT", + "dependencies": { + "micromark-factory-space": "^2.0.0", + "micromark-util-character": "^2.0.0", + "micromark-util-symbol": "^2.0.0", + "micromark-util-types": "^2.0.0" + } + }, + "node_modules/micromark-factory-whitespace": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/micromark-factory-whitespace/-/micromark-factory-whitespace-2.0.1.tgz", + "integrity": "sha512-Ob0nuZ3PKt/n0hORHyvoD9uZhr+Za8sFoP+OnMcnWK5lngSzALgQYKMr9RJVOWLqQYuyn6ulqGWSXdwf6F80lQ==", + "funding": [ + { + "type": "GitHub Sponsors", + "url": "https://github.com/sponsors/unifiedjs" + }, + { + "type": "OpenCollective", + "url": "https://opencollective.com/unified" + } + ], + "license": "MIT", + "dependencies": { + "micromark-factory-space": "^2.0.0", + "micromark-util-character": "^2.0.0", + "micromark-util-symbol": "^2.0.0", + "micromark-util-types": "^2.0.0" + } + }, + "node_modules/micromark-util-character": { + "version": "2.1.1", + "resolved": "https://registry.npmjs.org/micromark-util-character/-/micromark-util-character-2.1.1.tgz", + "integrity": "sha512-wv8tdUTJ3thSFFFJKtpYKOYiGP2+v96Hvk4Tu8KpCAsTMs6yi+nVmGh1syvSCsaxz45J6Jbw+9DD6g97+NV67Q==", + "funding": [ + { + "type": "GitHub Sponsors", + "url": "https://github.com/sponsors/unifiedjs" + }, + { + "type": "OpenCollective", + "url": "https://opencollective.com/unified" + } + ], + "license": "MIT", + "dependencies": { + "micromark-util-symbol": "^2.0.0", + "micromark-util-types": "^2.0.0" + } + }, + "node_modules/micromark-util-chunked": { + "version": "2.0.1", + "resolved": 
"https://registry.npmjs.org/micromark-util-chunked/-/micromark-util-chunked-2.0.1.tgz", + "integrity": "sha512-QUNFEOPELfmvv+4xiNg2sRYeS/P84pTW0TCgP5zc9FpXetHY0ab7SxKyAQCNCc1eK0459uoLI1y5oO5Vc1dbhA==", + "funding": [ + { + "type": "GitHub Sponsors", + "url": "https://github.com/sponsors/unifiedjs" + }, + { + "type": "OpenCollective", + "url": "https://opencollective.com/unified" + } + ], + "license": "MIT", + "dependencies": { + "micromark-util-symbol": "^2.0.0" + } + }, + "node_modules/micromark-util-classify-character": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/micromark-util-classify-character/-/micromark-util-classify-character-2.0.1.tgz", + "integrity": "sha512-K0kHzM6afW/MbeWYWLjoHQv1sgg2Q9EccHEDzSkxiP/EaagNzCm7T/WMKZ3rjMbvIpvBiZgwR3dKMygtA4mG1Q==", + "funding": [ + { + "type": "GitHub Sponsors", + "url": "https://github.com/sponsors/unifiedjs" + }, + { + "type": "OpenCollective", + "url": "https://opencollective.com/unified" + } + ], + "license": "MIT", + "dependencies": { + "micromark-util-character": "^2.0.0", + "micromark-util-symbol": "^2.0.0", + "micromark-util-types": "^2.0.0" + } + }, + "node_modules/micromark-util-combine-extensions": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/micromark-util-combine-extensions/-/micromark-util-combine-extensions-2.0.1.tgz", + "integrity": "sha512-OnAnH8Ujmy59JcyZw8JSbK9cGpdVY44NKgSM7E9Eh7DiLS2E9RNQf0dONaGDzEG9yjEl5hcqeIsj4hfRkLH/Bg==", + "funding": [ + { + "type": "GitHub Sponsors", + "url": "https://github.com/sponsors/unifiedjs" + }, + { + "type": "OpenCollective", + "url": "https://opencollective.com/unified" + } + ], + "license": "MIT", + "dependencies": { + "micromark-util-chunked": "^2.0.0", + "micromark-util-types": "^2.0.0" + } + }, + "node_modules/micromark-util-decode-numeric-character-reference": { + "version": "2.0.2", + "resolved": 
"https://registry.npmjs.org/micromark-util-decode-numeric-character-reference/-/micromark-util-decode-numeric-character-reference-2.0.2.tgz", + "integrity": "sha512-ccUbYk6CwVdkmCQMyr64dXz42EfHGkPQlBj5p7YVGzq8I7CtjXZJrubAYezf7Rp+bjPseiROqe7G6foFd+lEuw==", + "funding": [ + { + "type": "GitHub Sponsors", + "url": "https://github.com/sponsors/unifiedjs" + }, + { + "type": "OpenCollective", + "url": "https://opencollective.com/unified" + } + ], + "license": "MIT", + "dependencies": { + "micromark-util-symbol": "^2.0.0" + } + }, + "node_modules/micromark-util-decode-string": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/micromark-util-decode-string/-/micromark-util-decode-string-2.0.1.tgz", + "integrity": "sha512-nDV/77Fj6eH1ynwscYTOsbK7rR//Uj0bZXBwJZRfaLEJ1iGBR6kIfNmlNqaqJf649EP0F3NWNdeJi03elllNUQ==", + "funding": [ + { + "type": "GitHub Sponsors", + "url": "https://github.com/sponsors/unifiedjs" + }, + { + "type": "OpenCollective", + "url": "https://opencollective.com/unified" + } + ], + "license": "MIT", + "dependencies": { + "decode-named-character-reference": "^1.0.0", + "micromark-util-character": "^2.0.0", + "micromark-util-decode-numeric-character-reference": "^2.0.0", + "micromark-util-symbol": "^2.0.0" + } + }, + "node_modules/micromark-util-encode": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/micromark-util-encode/-/micromark-util-encode-2.0.1.tgz", + "integrity": "sha512-c3cVx2y4KqUnwopcO9b/SCdo2O67LwJJ/UyqGfbigahfegL9myoEFoDYZgkT7f36T0bLrM9hZTAaAyH+PCAXjw==", + "funding": [ + { + "type": "GitHub Sponsors", + "url": "https://github.com/sponsors/unifiedjs" + }, + { + "type": "OpenCollective", + "url": "https://opencollective.com/unified" + } + ], + "license": "MIT" + }, + "node_modules/micromark-util-html-tag-name": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/micromark-util-html-tag-name/-/micromark-util-html-tag-name-2.0.1.tgz", + "integrity": 
"sha512-2cNEiYDhCWKI+Gs9T0Tiysk136SnR13hhO8yW6BGNyhOC4qYFnwF1nKfD3HFAIXA5c45RrIG1ub11GiXeYd1xA==", + "funding": [ + { + "type": "GitHub Sponsors", + "url": "https://github.com/sponsors/unifiedjs" + }, + { + "type": "OpenCollective", + "url": "https://opencollective.com/unified" + } + ], + "license": "MIT" + }, + "node_modules/micromark-util-normalize-identifier": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/micromark-util-normalize-identifier/-/micromark-util-normalize-identifier-2.0.1.tgz", + "integrity": "sha512-sxPqmo70LyARJs0w2UclACPUUEqltCkJ6PhKdMIDuJ3gSf/Q+/GIe3WKl0Ijb/GyH9lOpUkRAO2wp0GVkLvS9Q==", + "funding": [ + { + "type": "GitHub Sponsors", + "url": "https://github.com/sponsors/unifiedjs" + }, + { + "type": "OpenCollective", + "url": "https://opencollective.com/unified" + } + ], + "license": "MIT", + "dependencies": { + "micromark-util-symbol": "^2.0.0" + } + }, + "node_modules/micromark-util-resolve-all": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/micromark-util-resolve-all/-/micromark-util-resolve-all-2.0.1.tgz", + "integrity": "sha512-VdQyxFWFT2/FGJgwQnJYbe1jjQoNTS4RjglmSjTUlpUMa95Htx9NHeYW4rGDJzbjvCsl9eLjMQwGeElsqmzcHg==", + "funding": [ + { + "type": "GitHub Sponsors", + "url": "https://github.com/sponsors/unifiedjs" + }, + { + "type": "OpenCollective", + "url": "https://opencollective.com/unified" + } + ], + "license": "MIT", + "dependencies": { + "micromark-util-types": "^2.0.0" + } + }, + "node_modules/micromark-util-sanitize-uri": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/micromark-util-sanitize-uri/-/micromark-util-sanitize-uri-2.0.1.tgz", + "integrity": "sha512-9N9IomZ/YuGGZZmQec1MbgxtlgougxTodVwDzzEouPKo3qFWvymFHWcnDi2vzV1ff6kas9ucW+o3yzJK9YB1AQ==", + "funding": [ + { + "type": "GitHub Sponsors", + "url": "https://github.com/sponsors/unifiedjs" + }, + { + "type": "OpenCollective", + "url": "https://opencollective.com/unified" + } + ], + "license": "MIT", + "dependencies": { + 
"micromark-util-character": "^2.0.0", + "micromark-util-encode": "^2.0.0", + "micromark-util-symbol": "^2.0.0" + } + }, + "node_modules/micromark-util-subtokenize": { + "version": "2.1.0", + "resolved": "https://registry.npmjs.org/micromark-util-subtokenize/-/micromark-util-subtokenize-2.1.0.tgz", + "integrity": "sha512-XQLu552iSctvnEcgXw6+Sx75GflAPNED1qx7eBJ+wydBb2KCbRZe+NwvIEEMM83uml1+2WSXpBAcp9IUCgCYWA==", + "funding": [ + { + "type": "GitHub Sponsors", + "url": "https://github.com/sponsors/unifiedjs" + }, + { + "type": "OpenCollective", + "url": "https://opencollective.com/unified" + } + ], + "license": "MIT", + "dependencies": { + "devlop": "^1.0.0", + "micromark-util-chunked": "^2.0.0", + "micromark-util-symbol": "^2.0.0", + "micromark-util-types": "^2.0.0" + } + }, + "node_modules/micromark-util-symbol": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/micromark-util-symbol/-/micromark-util-symbol-2.0.1.tgz", + "integrity": "sha512-vs5t8Apaud9N28kgCrRUdEed4UJ+wWNvicHLPxCa9ENlYuAY31M0ETy5y1vA33YoNPDFTghEbnh6efaE8h4x0Q==", + "funding": [ + { + "type": "GitHub Sponsors", + "url": "https://github.com/sponsors/unifiedjs" + }, + { + "type": "OpenCollective", + "url": "https://opencollective.com/unified" + } + ], + "license": "MIT" + }, + "node_modules/micromark-util-types": { + "version": "2.0.2", + "resolved": "https://registry.npmjs.org/micromark-util-types/-/micromark-util-types-2.0.2.tgz", + "integrity": "sha512-Yw0ECSpJoViF1qTU4DC6NwtC4aWGt1EkzaQB8KPPyCRR8z9TWeV0HbEFGTO+ZY1wB22zmxnJqhPyTpOVCpeHTA==", + "funding": [ + { + "type": "GitHub Sponsors", + "url": "https://github.com/sponsors/unifiedjs" + }, + { + "type": "OpenCollective", + "url": "https://opencollective.com/unified" + } + ], + "license": "MIT" + }, + "node_modules/minimatch": { + "version": "3.1.2", + "resolved": "https://registry.npmjs.org/minimatch/-/minimatch-3.1.2.tgz", + "integrity": 
"sha512-J7p63hRiAjw1NDEww1W7i37+ByIrOWO5XQQAzZ3VOcL0PNybwpfmV/N05zFAzwQ9USyEcX6t3UO+K5aqBQOIHw==", + "dev": true, + "license": "ISC", + "dependencies": { + "brace-expansion": "^1.1.7" + }, + "engines": { + "node": "*" + } + }, + "node_modules/ms": { + "version": "2.1.3", + "resolved": "https://registry.npmjs.org/ms/-/ms-2.1.3.tgz", + "integrity": "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==", + "license": "MIT" + }, + "node_modules/nanoid": { + "version": "3.3.11", + "resolved": "https://registry.npmjs.org/nanoid/-/nanoid-3.3.11.tgz", + "integrity": "sha512-N8SpfPUnUp1bK+PMYW8qSWdl9U+wwNWI4QKxOYDy9JAro3WMX7p2OeVRF9v+347pnakNevPmiHhNmZ2HbFA76w==", + "dev": true, + "funding": [ + { + "type": "github", + "url": "https://github.com/sponsors/ai" + } + ], + "license": "MIT", + "bin": { + "nanoid": "bin/nanoid.cjs" + }, + "engines": { + "node": "^10 || ^12 || ^13.7 || ^14 || >=15.0.1" + } + }, + "node_modules/natural-compare": { + "version": "1.4.0", + "resolved": "https://registry.npmjs.org/natural-compare/-/natural-compare-1.4.0.tgz", + "integrity": "sha512-OWND8ei3VtNC9h7V60qff3SVobHr996CTwgxubgyQYEpg290h9J0buyECNNJexkFm5sOajh5G116RYA1c8ZMSw==", + "dev": true, + "license": "MIT" + }, + "node_modules/node-releases": { + "version": "2.0.27", + "resolved": "https://registry.npmjs.org/node-releases/-/node-releases-2.0.27.tgz", + "integrity": "sha512-nmh3lCkYZ3grZvqcCH+fjmQ7X+H0OeZgP40OierEaAptX4XofMh5kwNbWh7lBduUzCcV/8kZ+NDLCwm2iorIlA==", + "dev": true, + "license": "MIT" + }, + "node_modules/object-assign": { + "version": "4.1.1", + "resolved": "https://registry.npmjs.org/object-assign/-/object-assign-4.1.1.tgz", + "integrity": "sha512-rJgTQnkUnH1sFw8yT6VSU3zD3sWmu6sZhIseY8VX+GRu3P6F7Fu+JNDoXfklElbLJSnc3FUQHVe4cU5hj+BcUg==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=0.10.0" + } + }, + "node_modules/object-inspect": { + "version": "1.13.4", + "resolved": 
"https://registry.npmjs.org/object-inspect/-/object-inspect-1.13.4.tgz", + "integrity": "sha512-W67iLl4J2EXEGTbfeHCffrjDfitvLANg0UlX3wFUUSTx92KXRFegMHUVgSqE+wvhAbi4WqjGg9czysTV2Epbew==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/object-keys": { + "version": "1.1.1", + "resolved": "https://registry.npmjs.org/object-keys/-/object-keys-1.1.1.tgz", + "integrity": "sha512-NuAESUOUMrlIXOfHKzD6bpPu3tYt3xvjNdRIQ+FeT0lNb4K8WR70CaDxhuNguS2XG+GjkyMwOzsN5ZktImfhLA==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">= 0.4" + } + }, + "node_modules/object.assign": { + "version": "4.1.7", + "resolved": "https://registry.npmjs.org/object.assign/-/object.assign-4.1.7.tgz", + "integrity": "sha512-nK28WOo+QIjBkDduTINE4JkF/UJJKyf2EJxvJKfblDpyg0Q+pkOHNTL0Qwy6NP6FhE/EnzV73BxxqcJaXY9anw==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bind": "^1.0.8", + "call-bound": "^1.0.3", + "define-properties": "^1.2.1", + "es-object-atoms": "^1.0.0", + "has-symbols": "^1.1.0", + "object-keys": "^1.1.1" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/object.entries": { + "version": "1.1.9", + "resolved": "https://registry.npmjs.org/object.entries/-/object.entries-1.1.9.tgz", + "integrity": "sha512-8u/hfXFRBD1O0hPUjioLhoWFHRmt6tKA4/vZPyckBr18l1KE9uHrFaFaUi8MDRTpi4uak2goyPTSNJLXX2k2Hw==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bind": "^1.0.8", + "call-bound": "^1.0.4", + "define-properties": "^1.2.1", + "es-object-atoms": "^1.1.1" + }, + "engines": { + "node": ">= 0.4" + } + }, + "node_modules/object.fromentries": { + "version": "2.0.8", + "resolved": "https://registry.npmjs.org/object.fromentries/-/object.fromentries-2.0.8.tgz", + "integrity": "sha512-k6E21FzySsSK5a21KRADBd/NGneRegFO5pLHfdQLpRDETUNJueLXs3WCzyQ3tFRDYgbq3KHGXfTbi2bs8WQ6rQ==", + "dev": 
true, + "license": "MIT", + "dependencies": { + "call-bind": "^1.0.7", + "define-properties": "^1.2.1", + "es-abstract": "^1.23.2", + "es-object-atoms": "^1.0.0" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/object.values": { + "version": "1.2.1", + "resolved": "https://registry.npmjs.org/object.values/-/object.values-1.2.1.tgz", + "integrity": "sha512-gXah6aZrcUxjWg2zR2MwouP2eHlCBzdV4pygudehaKXSGW4v2AsRQUK+lwwXhii6KFZcunEnmSUoYp5CXibxtA==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bind": "^1.0.8", + "call-bound": "^1.0.3", + "define-properties": "^1.2.1", + "es-object-atoms": "^1.0.0" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/optionator": { + "version": "0.9.4", + "resolved": "https://registry.npmjs.org/optionator/-/optionator-0.9.4.tgz", + "integrity": "sha512-6IpQ7mKUxRcZNLIObR0hz7lxsapSSIYNZJwXPGeF0mTVqGKFIXj1DQcMoT22S3ROcLyY/rz0PWaWZ9ayWmad9g==", + "dev": true, + "license": "MIT", + "dependencies": { + "deep-is": "^0.1.3", + "fast-levenshtein": "^2.0.6", + "levn": "^0.4.1", + "prelude-ls": "^1.2.1", + "type-check": "^0.4.0", + "word-wrap": "^1.2.5" + }, + "engines": { + "node": ">= 0.8.0" + } + }, + "node_modules/own-keys": { + "version": "1.0.1", + "resolved": "https://registry.npmjs.org/own-keys/-/own-keys-1.0.1.tgz", + "integrity": "sha512-qFOyK5PjiWZd+QQIh+1jhdb9LpxTF0qs7Pm8o5QHYZ0M3vKqSqzsZaEB6oWlxZ+q2sJBMI/Ktgd2N5ZwQoRHfg==", + "dev": true, + "license": "MIT", + "dependencies": { + "get-intrinsic": "^1.2.6", + "object-keys": "^1.1.1", + "safe-push-apply": "^1.0.0" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/p-limit": { + "version": "3.1.0", + "resolved": "https://registry.npmjs.org/p-limit/-/p-limit-3.1.0.tgz", + "integrity": 
"sha512-TYOanM3wGwNGsZN2cVTYPArw454xnXj5qmWF1bEoAc4+cU/ol7GVh7odevjp1FNHduHc3KZMcFduxU5Xc6uJRQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "yocto-queue": "^0.1.0" + }, + "engines": { + "node": ">=10" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, + "node_modules/p-locate": { + "version": "5.0.0", + "resolved": "https://registry.npmjs.org/p-locate/-/p-locate-5.0.0.tgz", + "integrity": "sha512-LaNjtRWUBY++zB5nE/NwcaoMylSPk+S+ZHNB1TzdbMJMny6dynpAGt7X/tl/QYq3TIeE6nxHppbo2LGymrG5Pw==", + "dev": true, + "license": "MIT", + "dependencies": { + "p-limit": "^3.0.2" + }, + "engines": { + "node": ">=10" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, + "node_modules/parent-module": { + "version": "1.0.1", + "resolved": "https://registry.npmjs.org/parent-module/-/parent-module-1.0.1.tgz", + "integrity": "sha512-GQ2EWRpQV8/o+Aw8YqtfZZPfNRWZYkbidE9k5rpl/hC3vtHHBfGm2Ifi6qWV+coDGkrUKZAxE3Lot5kcsRlh+g==", + "dev": true, + "license": "MIT", + "dependencies": { + "callsites": "^3.0.0" + }, + "engines": { + "node": ">=6" + } + }, + "node_modules/parse-entities": { + "version": "4.0.2", + "resolved": "https://registry.npmjs.org/parse-entities/-/parse-entities-4.0.2.tgz", + "integrity": "sha512-GG2AQYWoLgL877gQIKeRPGO1xF9+eG1ujIb5soS5gPvLQ1y2o8FL90w2QWNdf9I361Mpp7726c+lj3U0qK1uGw==", + "license": "MIT", + "dependencies": { + "@types/unist": "^2.0.0", + "character-entities-legacy": "^3.0.0", + "character-reference-invalid": "^2.0.0", + "decode-named-character-reference": "^1.0.0", + "is-alphanumerical": "^2.0.0", + "is-decimal": "^2.0.0", + "is-hexadecimal": "^2.0.0" + }, + "funding": { + "type": "github", + "url": "https://github.com/sponsors/wooorm" + } + }, + "node_modules/parse-entities/node_modules/@types/unist": { + "version": "2.0.11", + "resolved": "https://registry.npmjs.org/@types/unist/-/unist-2.0.11.tgz", + "integrity": 
"sha512-CmBKiL6NNo/OqgmMn95Fk9Whlp2mtvIv+KNpQKN2F4SjvrEesubTRWGYSg+BnWZOnlCaSTU1sMpsBOzgbYhnsA==", + "license": "MIT" + }, + "node_modules/path-exists": { + "version": "4.0.0", + "resolved": "https://registry.npmjs.org/path-exists/-/path-exists-4.0.0.tgz", + "integrity": "sha512-ak9Qy5Q7jYb2Wwcey5Fpvg2KoAc/ZIhLSLOSBmRmygPsGwkVVt0fZa0qrtMz+m6tJTAHfZQ8FnmB4MG4LWy7/w==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=8" + } + }, + "node_modules/path-key": { + "version": "3.1.1", + "resolved": "https://registry.npmjs.org/path-key/-/path-key-3.1.1.tgz", + "integrity": "sha512-ojmeN0qd+y0jszEtoY48r0Peq5dwMEkIlCOu6Q5f41lfkswXuKtYrhgoTpLnyIcHm24Uhqx+5Tqm2InSwLhE6Q==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=8" + } + }, + "node_modules/path-parse": { + "version": "1.0.7", + "resolved": "https://registry.npmjs.org/path-parse/-/path-parse-1.0.7.tgz", + "integrity": "sha512-LDJzPVEEEPR+y48z93A0Ed0yXb8pAByGWo/k5YYdYgpY2/2EsOsksJrq7lOHxryrVOn1ejG6oAp8ahvOIQD8sw==", + "dev": true, + "license": "MIT" + }, + "node_modules/picocolors": { + "version": "1.1.1", + "resolved": "https://registry.npmjs.org/picocolors/-/picocolors-1.1.1.tgz", + "integrity": "sha512-xceH2snhtb5M9liqDsmEw56le376mTZkEX/jEb/RxNFyegNul7eNslCXP9FDj/Lcu0X8KEyMceP2ntpaHrDEVA==", + "dev": true, + "license": "ISC" + }, + "node_modules/picomatch": { + "version": "4.0.3", + "resolved": "https://registry.npmjs.org/picomatch/-/picomatch-4.0.3.tgz", + "integrity": "sha512-5gTmgEY/sqK6gFXLIsQNH19lWb4ebPDLA4SdLP7dsWkIXHWlG66oPuVvXSGFPppYZz8ZDZq0dYYrbHfBCVUb1Q==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=12" + }, + "funding": { + "url": "https://github.com/sponsors/jonschlinkert" + } + }, + "node_modules/possible-typed-array-names": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/possible-typed-array-names/-/possible-typed-array-names-1.1.0.tgz", + "integrity": 
"sha512-/+5VFTchJDoVj3bhoqi6UeymcD00DAwb1nJwamzPvHEszJ4FpF6SNNbUbOS8yI56qHzdV8eK0qEfOSiodkTdxg==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">= 0.4" + } + }, + "node_modules/postcss": { + "version": "8.5.6", + "resolved": "https://registry.npmjs.org/postcss/-/postcss-8.5.6.tgz", + "integrity": "sha512-3Ybi1tAuwAP9s0r1UQ2J4n5Y0G05bJkpUIO0/bI9MhwmD70S5aTWbXGBwxHrelT+XM1k6dM0pk+SwNkpTRN7Pg==", + "dev": true, + "funding": [ + { + "type": "opencollective", + "url": "https://opencollective.com/postcss/" + }, + { + "type": "tidelift", + "url": "https://tidelift.com/funding/github/npm/postcss" + }, + { + "type": "github", + "url": "https://github.com/sponsors/ai" + } + ], + "license": "MIT", + "dependencies": { + "nanoid": "^3.3.11", + "picocolors": "^1.1.1", + "source-map-js": "^1.2.1" + }, + "engines": { + "node": "^10 || ^12 || >=14" + } + }, + "node_modules/postcss-value-parser": { + "version": "4.2.0", + "resolved": "https://registry.npmjs.org/postcss-value-parser/-/postcss-value-parser-4.2.0.tgz", + "integrity": "sha512-1NNCs6uurfkVbeXG4S8JFT9t19m45ICnif8zWLd5oPSZ50QnwMfK+H3jv408d4jw/7Bttv5axS5IiHoLaVNHeQ==", + "dev": true, + "license": "MIT" + }, + "node_modules/prelude-ls": { + "version": "1.2.1", + "resolved": "https://registry.npmjs.org/prelude-ls/-/prelude-ls-1.2.1.tgz", + "integrity": "sha512-vkcDPrRZo1QZLbn5RLGPpg/WmIQ65qoWWhcGKf/b5eplkkarX0m9z8ppCat4mlOqUsWpyNuYgO3VRyrYHSzX5g==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">= 0.8.0" + } + }, + "node_modules/prop-types": { + "version": "15.8.1", + "resolved": "https://registry.npmjs.org/prop-types/-/prop-types-15.8.1.tgz", + "integrity": "sha512-oj87CgZICdulUohogVAR7AjlC0327U4el4L6eAvOqCeudMDVU0NThNaV+b9Df4dXgSP1gXMTnPdhfe/2qDH5cg==", + "dev": true, + "license": "MIT", + "dependencies": { + "loose-envify": "^1.4.0", + "object-assign": "^4.1.1", + "react-is": "^16.13.1" + } + }, + "node_modules/property-information": { + "version": "7.1.0", + "resolved": 
"https://registry.npmjs.org/property-information/-/property-information-7.1.0.tgz", + "integrity": "sha512-TwEZ+X+yCJmYfL7TPUOcvBZ4QfoT5YenQiJuX//0th53DE6w0xxLEtfK3iyryQFddXuvkIk51EEgrJQ0WJkOmQ==", + "license": "MIT", + "funding": { + "type": "github", + "url": "https://github.com/sponsors/wooorm" + } + }, + "node_modules/punycode": { + "version": "2.3.1", + "resolved": "https://registry.npmjs.org/punycode/-/punycode-2.3.1.tgz", + "integrity": "sha512-vYt7UD1U9Wg6138shLtLOvdAu+8DsC/ilFtEVHcH+wydcSpNE20AfSOduf6MkRFahL5FY7X1oU7nKVZFtfq8Fg==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=6" + } + }, + "node_modules/react": { + "version": "18.3.1", + "resolved": "https://registry.npmjs.org/react/-/react-18.3.1.tgz", + "integrity": "sha512-wS+hAgJShR0KhEvPJArfuPVN1+Hz1t0Y6n5jLrGQbkb4urgPE/0Rve+1kMB1v/oWgHgm4WIcV+i7F2pTVj+2iQ==", + "license": "MIT", + "dependencies": { + "loose-envify": "^1.1.0" + }, + "engines": { + "node": ">=0.10.0" + } + }, + "node_modules/react-dom": { + "version": "18.3.1", + "resolved": "https://registry.npmjs.org/react-dom/-/react-dom-18.3.1.tgz", + "integrity": "sha512-5m4nQKp+rZRb09LNH59GM4BxTh9251/ylbKIbpe7TpGxfJ+9kv6BLkLBXIjjspbgbnIBNqlI23tRnTWT0snUIw==", + "license": "MIT", + "dependencies": { + "loose-envify": "^1.1.0", + "scheduler": "^0.23.2" + }, + "peerDependencies": { + "react": "^18.3.1" + } + }, + "node_modules/react-is": { + "version": "16.13.1", + "resolved": "https://registry.npmjs.org/react-is/-/react-is-16.13.1.tgz", + "integrity": "sha512-24e6ynE2H+OKt4kqsOvNd8kBpV65zoxbA4BVsEOB3ARVWQki/DHzaUoC5KuON/BiccDaCCTZBuOcfZs70kR8bQ==", + "dev": true, + "license": "MIT" + }, + "node_modules/react-markdown": { + "version": "10.1.0", + "resolved": "https://registry.npmjs.org/react-markdown/-/react-markdown-10.1.0.tgz", + "integrity": "sha512-qKxVopLT/TyA6BX3Ue5NwabOsAzm0Q7kAPwq6L+wWDwisYs7R8vZ0nRXqq6rkueboxpkjvLGU9fWifiX/ZZFxQ==", + "license": "MIT", + "dependencies": { + "@types/hast": "^3.0.0", + "@types/mdast": 
"^4.0.0", + "devlop": "^1.0.0", + "hast-util-to-jsx-runtime": "^2.0.0", + "html-url-attributes": "^3.0.0", + "mdast-util-to-hast": "^13.0.0", + "remark-parse": "^11.0.0", + "remark-rehype": "^11.0.0", + "unified": "^11.0.0", + "unist-util-visit": "^5.0.0", + "vfile": "^6.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/unified" + }, + "peerDependencies": { + "@types/react": ">=18", + "react": ">=18" + } + }, + "node_modules/react-refresh": { + "version": "0.17.0", + "resolved": "https://registry.npmjs.org/react-refresh/-/react-refresh-0.17.0.tgz", + "integrity": "sha512-z6F7K9bV85EfseRCp2bzrpyQ0Gkw1uLoCel9XBVWPg/TjRj94SkJzUTGfOa4bs7iJvBWtQG0Wq7wnI0syw3EBQ==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=0.10.0" + } + }, + "node_modules/reflect.getprototypeof": { + "version": "1.0.10", + "resolved": "https://registry.npmjs.org/reflect.getprototypeof/-/reflect.getprototypeof-1.0.10.tgz", + "integrity": "sha512-00o4I+DVrefhv+nX0ulyi3biSHCPDe+yLv5o/p6d/UVlirijB8E16FtfwSAi4g3tcqrQ4lRAqQSoFEZJehYEcw==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bind": "^1.0.8", + "define-properties": "^1.2.1", + "es-abstract": "^1.23.9", + "es-errors": "^1.3.0", + "es-object-atoms": "^1.0.0", + "get-intrinsic": "^1.2.7", + "get-proto": "^1.0.1", + "which-builtin-type": "^1.2.1" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/regexp.prototype.flags": { + "version": "1.5.4", + "resolved": "https://registry.npmjs.org/regexp.prototype.flags/-/regexp.prototype.flags-1.5.4.tgz", + "integrity": "sha512-dYqgNSZbDwkaJ2ceRd9ojCGjBq+mOm9LmtXnAnEGyHhN/5R7iDW2TRw3h+o/jCFxus3P2LfWIIiwowAjANm7IA==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bind": "^1.0.8", + "define-properties": "^1.2.1", + "es-errors": "^1.3.0", + "get-proto": "^1.0.1", + "gopd": "^1.2.0", + "set-function-name": "^2.0.2" + }, + "engines": { + "node": ">= 
0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/remark-parse": { + "version": "11.0.0", + "resolved": "https://registry.npmjs.org/remark-parse/-/remark-parse-11.0.0.tgz", + "integrity": "sha512-FCxlKLNGknS5ba/1lmpYijMUzX2esxW5xQqjWxw2eHFfS2MSdaHVINFmhjo+qN1WhZhNimq0dZATN9pH0IDrpA==", + "license": "MIT", + "dependencies": { + "@types/mdast": "^4.0.0", + "mdast-util-from-markdown": "^2.0.0", + "micromark-util-types": "^2.0.0", + "unified": "^11.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/unified" + } + }, + "node_modules/remark-rehype": { + "version": "11.1.2", + "resolved": "https://registry.npmjs.org/remark-rehype/-/remark-rehype-11.1.2.tgz", + "integrity": "sha512-Dh7l57ianaEoIpzbp0PC9UKAdCSVklD8E5Rpw7ETfbTl3FqcOOgq5q2LVDhgGCkaBv7p24JXikPdvhhmHvKMsw==", + "license": "MIT", + "dependencies": { + "@types/hast": "^3.0.0", + "@types/mdast": "^4.0.0", + "mdast-util-to-hast": "^13.0.0", + "unified": "^11.0.0", + "vfile": "^6.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/unified" + } + }, + "node_modules/resolve": { + "version": "2.0.0-next.5", + "resolved": "https://registry.npmjs.org/resolve/-/resolve-2.0.0-next.5.tgz", + "integrity": "sha512-U7WjGVG9sH8tvjW5SmGbQuui75FiyjAX72HX15DwBBwF9dNiQZRQAg9nnPhYy+TUnE0+VcrttuvNI8oSxZcocA==", + "dev": true, + "license": "MIT", + "dependencies": { + "is-core-module": "^2.13.0", + "path-parse": "^1.0.7", + "supports-preserve-symlinks-flag": "^1.0.0" + }, + "bin": { + "resolve": "bin/resolve" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/resolve-from": { + "version": "4.0.0", + "resolved": "https://registry.npmjs.org/resolve-from/-/resolve-from-4.0.0.tgz", + "integrity": "sha512-pb/MYmXstAkysRFx8piNI1tGFNQIFA3vkE3Gq4EuA1dF6gHp/+vgZqsCGJapvy8N3Q+4o7FwvquPJcnZ7RYy4g==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=4" + } + }, + 
"node_modules/rollup": { + "version": "4.55.1", + "resolved": "https://registry.npmjs.org/rollup/-/rollup-4.55.1.tgz", + "integrity": "sha512-wDv/Ht1BNHB4upNbK74s9usvl7hObDnvVzknxqY/E/O3X6rW1U1rV1aENEfJ54eFZDTNo7zv1f5N4edCluH7+A==", + "dev": true, + "license": "MIT", + "dependencies": { + "@types/estree": "1.0.8" + }, + "bin": { + "rollup": "dist/bin/rollup" + }, + "engines": { + "node": ">=18.0.0", + "npm": ">=8.0.0" + }, + "optionalDependencies": { + "@rollup/rollup-android-arm-eabi": "4.55.1", + "@rollup/rollup-android-arm64": "4.55.1", + "@rollup/rollup-darwin-arm64": "4.55.1", + "@rollup/rollup-darwin-x64": "4.55.1", + "@rollup/rollup-freebsd-arm64": "4.55.1", + "@rollup/rollup-freebsd-x64": "4.55.1", + "@rollup/rollup-linux-arm-gnueabihf": "4.55.1", + "@rollup/rollup-linux-arm-musleabihf": "4.55.1", + "@rollup/rollup-linux-arm64-gnu": "4.55.1", + "@rollup/rollup-linux-arm64-musl": "4.55.1", + "@rollup/rollup-linux-loong64-gnu": "4.55.1", + "@rollup/rollup-linux-loong64-musl": "4.55.1", + "@rollup/rollup-linux-ppc64-gnu": "4.55.1", + "@rollup/rollup-linux-ppc64-musl": "4.55.1", + "@rollup/rollup-linux-riscv64-gnu": "4.55.1", + "@rollup/rollup-linux-riscv64-musl": "4.55.1", + "@rollup/rollup-linux-s390x-gnu": "4.55.1", + "@rollup/rollup-linux-x64-gnu": "4.55.1", + "@rollup/rollup-linux-x64-musl": "4.55.1", + "@rollup/rollup-openbsd-x64": "4.55.1", + "@rollup/rollup-openharmony-arm64": "4.55.1", + "@rollup/rollup-win32-arm64-msvc": "4.55.1", + "@rollup/rollup-win32-ia32-msvc": "4.55.1", + "@rollup/rollup-win32-x64-gnu": "4.55.1", + "@rollup/rollup-win32-x64-msvc": "4.55.1", + "fsevents": "~2.3.2" + } + }, + "node_modules/safe-array-concat": { + "version": "1.1.3", + "resolved": "https://registry.npmjs.org/safe-array-concat/-/safe-array-concat-1.1.3.tgz", + "integrity": "sha512-AURm5f0jYEOydBj7VQlVvDrjeFgthDdEF5H1dP+6mNpoXOMo1quQqJ4wvJDyRZ9+pO3kGWoOdmV08cSv2aJV6Q==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bind": "^1.0.8", + "call-bound": 
"^1.0.2", + "get-intrinsic": "^1.2.6", + "has-symbols": "^1.1.0", + "isarray": "^2.0.5" + }, + "engines": { + "node": ">=0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/safe-push-apply": { + "version": "1.0.0", + "resolved": "https://registry.npmjs.org/safe-push-apply/-/safe-push-apply-1.0.0.tgz", + "integrity": "sha512-iKE9w/Z7xCzUMIZqdBsp6pEQvwuEebH4vdpjcDWnyzaI6yl6O9FHvVpmGelvEHNsoY6wGblkxR6Zty/h00WiSA==", + "dev": true, + "license": "MIT", + "dependencies": { + "es-errors": "^1.3.0", + "isarray": "^2.0.5" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/safe-regex-test": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/safe-regex-test/-/safe-regex-test-1.1.0.tgz", + "integrity": "sha512-x/+Cz4YrimQxQccJf5mKEbIa1NzeCRNI5Ecl/ekmlYaampdNLPalVyIcCZNNH3MvmqBugV5TMYZXv0ljslUlaw==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bound": "^1.0.2", + "es-errors": "^1.3.0", + "is-regex": "^1.2.1" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/scheduler": { + "version": "0.23.2", + "resolved": "https://registry.npmjs.org/scheduler/-/scheduler-0.23.2.tgz", + "integrity": "sha512-UOShsPwz7NrMUqhR6t0hWjFduvOzbtv7toDH1/hIrfRNIDBnnBWd0CwJTGvTpngVlmwGCdP9/Zl/tVrDqcuYzQ==", + "license": "MIT", + "dependencies": { + "loose-envify": "^1.1.0" + } + }, + "node_modules/semver": { + "version": "6.3.1", + "resolved": "https://registry.npmjs.org/semver/-/semver-6.3.1.tgz", + "integrity": "sha512-BR7VvDCVHO+q2xBEWskxS6DJE1qRnb7DxzUrogb71CWoSficBxYsiAGd+Kl0mmq/MprG9yArRkyrQxTO6XjMzA==", + "dev": true, + "license": "ISC", + "bin": { + "semver": "bin/semver.js" + } + }, + "node_modules/set-function-length": { + "version": "1.2.2", + "resolved": "https://registry.npmjs.org/set-function-length/-/set-function-length-1.2.2.tgz", + "integrity": 
"sha512-pgRc4hJ4/sNjWCSS9AmnS40x3bNMDTknHgL5UaMBTMyJnU90EgWh1Rz+MC9eFu4BuN/UwZjKQuY/1v3rM7HMfg==", + "dev": true, + "license": "MIT", + "dependencies": { + "define-data-property": "^1.1.4", + "es-errors": "^1.3.0", + "function-bind": "^1.1.2", + "get-intrinsic": "^1.2.4", + "gopd": "^1.0.1", + "has-property-descriptors": "^1.0.2" + }, + "engines": { + "node": ">= 0.4" + } + }, + "node_modules/set-function-name": { + "version": "2.0.2", + "resolved": "https://registry.npmjs.org/set-function-name/-/set-function-name-2.0.2.tgz", + "integrity": "sha512-7PGFlmtwsEADb0WYyvCMa1t+yke6daIG4Wirafur5kcf+MhUnPms1UeR0CKQdTZD81yESwMHbtn+TR+dMviakQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "define-data-property": "^1.1.4", + "es-errors": "^1.3.0", + "functions-have-names": "^1.2.3", + "has-property-descriptors": "^1.0.2" + }, + "engines": { + "node": ">= 0.4" + } + }, + "node_modules/set-proto": { + "version": "1.0.0", + "resolved": "https://registry.npmjs.org/set-proto/-/set-proto-1.0.0.tgz", + "integrity": "sha512-RJRdvCo6IAnPdsvP/7m6bsQqNnn1FCBX5ZNtFL98MmFF/4xAIJTIg1YbHW5DC2W5SKZanrC6i4HsJqlajw/dZw==", + "dev": true, + "license": "MIT", + "dependencies": { + "dunder-proto": "^1.0.1", + "es-errors": "^1.3.0", + "es-object-atoms": "^1.0.0" + }, + "engines": { + "node": ">= 0.4" + } + }, + "node_modules/shebang-command": { + "version": "2.0.0", + "resolved": "https://registry.npmjs.org/shebang-command/-/shebang-command-2.0.0.tgz", + "integrity": "sha512-kHxr2zZpYtdmrN1qDjrrX/Z1rR1kG8Dx+gkpK1G4eXmvXswmcE1hTWBWYUzlraYw1/yZp6YuDY77YtvbN0dmDA==", + "dev": true, + "license": "MIT", + "dependencies": { + "shebang-regex": "^3.0.0" + }, + "engines": { + "node": ">=8" + } + }, + "node_modules/shebang-regex": { + "version": "3.0.0", + "resolved": "https://registry.npmjs.org/shebang-regex/-/shebang-regex-3.0.0.tgz", + "integrity": "sha512-7++dFhtcx3353uBaq8DDR4NuxBetBzC7ZQOhmTQInHEd6bSrXdiEyzCvG07Z44UYdLShWUyXt5M/yhz8ekcb1A==", + "dev": true, + "license": "MIT", + 
"engines": { + "node": ">=8" + } + }, + "node_modules/side-channel": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/side-channel/-/side-channel-1.1.0.tgz", + "integrity": "sha512-ZX99e6tRweoUXqR+VBrslhda51Nh5MTQwou5tnUDgbtyM0dBgmhEDtWGP/xbKn6hqfPRHujUNwz5fy/wbbhnpw==", + "dev": true, + "license": "MIT", + "dependencies": { + "es-errors": "^1.3.0", + "object-inspect": "^1.13.3", + "side-channel-list": "^1.0.0", + "side-channel-map": "^1.0.1", + "side-channel-weakmap": "^1.0.2" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/side-channel-list": { + "version": "1.0.0", + "resolved": "https://registry.npmjs.org/side-channel-list/-/side-channel-list-1.0.0.tgz", + "integrity": "sha512-FCLHtRD/gnpCiCHEiJLOwdmFP+wzCmDEkc9y7NsYxeF4u7Btsn1ZuwgwJGxImImHicJArLP4R0yX4c2KCrMrTA==", + "dev": true, + "license": "MIT", + "dependencies": { + "es-errors": "^1.3.0", + "object-inspect": "^1.13.3" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/side-channel-map": { + "version": "1.0.1", + "resolved": "https://registry.npmjs.org/side-channel-map/-/side-channel-map-1.0.1.tgz", + "integrity": "sha512-VCjCNfgMsby3tTdo02nbjtM/ewra6jPHmpThenkTYh8pG9ucZ/1P8So4u4FGBek/BjpOVsDCMoLA/iuBKIFXRA==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bound": "^1.0.2", + "es-errors": "^1.3.0", + "get-intrinsic": "^1.2.5", + "object-inspect": "^1.13.3" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/side-channel-weakmap": { + "version": "1.0.2", + "resolved": "https://registry.npmjs.org/side-channel-weakmap/-/side-channel-weakmap-1.0.2.tgz", + "integrity": "sha512-WPS/HvHQTYnHisLo9McqBHOJk2FkHO/tlpvldyrnem4aeQp4hai3gythswg6p01oSoTl58rcpiFAjF2br2Ak2A==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bound": 
"^1.0.2", + "es-errors": "^1.3.0", + "get-intrinsic": "^1.2.5", + "object-inspect": "^1.13.3", + "side-channel-map": "^1.0.1" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/source-map-js": { + "version": "1.2.1", + "resolved": "https://registry.npmjs.org/source-map-js/-/source-map-js-1.2.1.tgz", + "integrity": "sha512-UXWMKhLOwVKb728IUtQPXxfYU+usdybtUrK/8uGE8CQMvrhOpwvzDBwj0QhSL7MQc7vIsISBG8VQ8+IDQxpfQA==", + "dev": true, + "license": "BSD-3-Clause", + "engines": { + "node": ">=0.10.0" + } + }, + "node_modules/space-separated-tokens": { + "version": "2.0.2", + "resolved": "https://registry.npmjs.org/space-separated-tokens/-/space-separated-tokens-2.0.2.tgz", + "integrity": "sha512-PEGlAwrG8yXGXRjW32fGbg66JAlOAwbObuqVoJpv/mRgoWDQfgH1wDPvtzWyUSNAXBGSk8h755YDbbcEy3SH2Q==", + "license": "MIT", + "funding": { + "type": "github", + "url": "https://github.com/sponsors/wooorm" + } + }, + "node_modules/stop-iteration-iterator": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/stop-iteration-iterator/-/stop-iteration-iterator-1.1.0.tgz", + "integrity": "sha512-eLoXW/DHyl62zxY4SCaIgnRhuMr6ri4juEYARS8E6sCEqzKpOiE521Ucofdx+KnDZl5xmvGYaaKCk5FEOxJCoQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "es-errors": "^1.3.0", + "internal-slot": "^1.1.0" + }, + "engines": { + "node": ">= 0.4" + } + }, + "node_modules/string.prototype.matchall": { + "version": "4.0.12", + "resolved": "https://registry.npmjs.org/string.prototype.matchall/-/string.prototype.matchall-4.0.12.tgz", + "integrity": "sha512-6CC9uyBL+/48dYizRf7H7VAYCMCNTBeM78x/VTUe9bFEaxBepPJDa1Ow99LqI/1yF7kuy7Q3cQsYMrcjGUcskA==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bind": "^1.0.8", + "call-bound": "^1.0.3", + "define-properties": "^1.2.1", + "es-abstract": "^1.23.6", + "es-errors": "^1.3.0", + "es-object-atoms": "^1.0.0", + "get-intrinsic": "^1.2.6", + "gopd": "^1.2.0", + "has-symbols": 
"^1.1.0", + "internal-slot": "^1.1.0", + "regexp.prototype.flags": "^1.5.3", + "set-function-name": "^2.0.2", + "side-channel": "^1.1.0" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/string.prototype.repeat": { + "version": "1.0.0", + "resolved": "https://registry.npmjs.org/string.prototype.repeat/-/string.prototype.repeat-1.0.0.tgz", + "integrity": "sha512-0u/TldDbKD8bFCQ/4f5+mNRrXwZ8hg2w7ZR8wa16e8z9XpePWl3eGEcUD0OXpEH/VJH/2G3gjUtR3ZOiBe2S/w==", + "dev": true, + "license": "MIT", + "dependencies": { + "define-properties": "^1.1.3", + "es-abstract": "^1.17.5" + } + }, + "node_modules/string.prototype.trim": { + "version": "1.2.10", + "resolved": "https://registry.npmjs.org/string.prototype.trim/-/string.prototype.trim-1.2.10.tgz", + "integrity": "sha512-Rs66F0P/1kedk5lyYyH9uBzuiI/kNRmwJAR9quK6VOtIpZ2G+hMZd+HQbbv25MgCA6gEffoMZYxlTod4WcdrKA==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bind": "^1.0.8", + "call-bound": "^1.0.2", + "define-data-property": "^1.1.4", + "define-properties": "^1.2.1", + "es-abstract": "^1.23.5", + "es-object-atoms": "^1.0.0", + "has-property-descriptors": "^1.0.2" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/string.prototype.trimend": { + "version": "1.0.9", + "resolved": "https://registry.npmjs.org/string.prototype.trimend/-/string.prototype.trimend-1.0.9.tgz", + "integrity": "sha512-G7Ok5C6E/j4SGfyLCloXTrngQIQU3PWtXGst3yM7Bea9FRURf1S42ZHlZZtsNque2FN2PoUhfZXYLNWwEr4dLQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bind": "^1.0.8", + "call-bound": "^1.0.2", + "define-properties": "^1.2.1", + "es-object-atoms": "^1.0.0" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/string.prototype.trimstart": { + "version": "1.0.8", + "resolved": 
"https://registry.npmjs.org/string.prototype.trimstart/-/string.prototype.trimstart-1.0.8.tgz", + "integrity": "sha512-UXSH262CSZY1tfu3G3Secr6uGLCFVPMhIqHjlgCUtCCcgihYc/xKs9djMTMUOb2j1mVSeU8EU6NWc/iQKU6Gfg==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bind": "^1.0.7", + "define-properties": "^1.2.1", + "es-object-atoms": "^1.0.0" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/stringify-entities": { + "version": "4.0.4", + "resolved": "https://registry.npmjs.org/stringify-entities/-/stringify-entities-4.0.4.tgz", + "integrity": "sha512-IwfBptatlO+QCJUo19AqvrPNqlVMpW9YEL2LIVY+Rpv2qsjCGxaDLNRgeGsQWJhfItebuJhsGSLjaBbNSQ+ieg==", + "license": "MIT", + "dependencies": { + "character-entities-html4": "^2.0.0", + "character-entities-legacy": "^3.0.0" + }, + "funding": { + "type": "github", + "url": "https://github.com/sponsors/wooorm" + } + }, + "node_modules/strip-json-comments": { + "version": "3.1.1", + "resolved": "https://registry.npmjs.org/strip-json-comments/-/strip-json-comments-3.1.1.tgz", + "integrity": "sha512-6fPc+R4ihwqP6N/aIv2f1gMH8lOVtWQHoqC4yK6oSDVVocumAsfCqjkXnqiYMhmMwS/mEHLp7Vehlt3ql6lEig==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=8" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, + "node_modules/style-to-js": { + "version": "1.1.21", + "resolved": "https://registry.npmjs.org/style-to-js/-/style-to-js-1.1.21.tgz", + "integrity": "sha512-RjQetxJrrUJLQPHbLku6U/ocGtzyjbJMP9lCNK7Ag0CNh690nSH8woqWH9u16nMjYBAok+i7JO1NP2pOy8IsPQ==", + "license": "MIT", + "dependencies": { + "style-to-object": "1.0.14" + } + }, + "node_modules/style-to-object": { + "version": "1.0.14", + "resolved": "https://registry.npmjs.org/style-to-object/-/style-to-object-1.0.14.tgz", + "integrity": "sha512-LIN7rULI0jBscWQYaSswptyderlarFkjQ+t79nzty8tcIAceVomEVlLzH5VP4Cmsv6MtKhs7qaAiwlcp+Mgaxw==", + "license": "MIT", + "dependencies": 
{ + "inline-style-parser": "0.2.7" + } + }, + "node_modules/supports-color": { + "version": "7.2.0", + "resolved": "https://registry.npmjs.org/supports-color/-/supports-color-7.2.0.tgz", + "integrity": "sha512-qpCAvRl9stuOHveKsn7HncJRvv501qIacKzQlO/+Lwxc9+0q2wLyv4Dfvt80/DPn2pqOBsJdDiogXGR9+OvwRw==", + "dev": true, + "license": "MIT", + "dependencies": { + "has-flag": "^4.0.0" + }, + "engines": { + "node": ">=8" + } + }, + "node_modules/supports-preserve-symlinks-flag": { + "version": "1.0.0", + "resolved": "https://registry.npmjs.org/supports-preserve-symlinks-flag/-/supports-preserve-symlinks-flag-1.0.0.tgz", + "integrity": "sha512-ot0WnXS9fgdkgIcePe6RHNk1WA8+muPa6cSjeR3V8K27q9BB1rTE3R1p7Hv0z1ZyAc8s6Vvv8DIyWf681MAt0w==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/tailwindcss": { + "version": "4.1.18", + "resolved": "https://registry.npmjs.org/tailwindcss/-/tailwindcss-4.1.18.tgz", + "integrity": "sha512-4+Z+0yiYyEtUVCScyfHCxOYP06L5Ne+JiHhY2IjR2KWMIWhJOYZKLSGZaP5HkZ8+bY0cxfzwDE5uOmzFXyIwxw==", + "license": "MIT" + }, + "node_modules/tapable": { + "version": "2.3.0", + "resolved": "https://registry.npmjs.org/tapable/-/tapable-2.3.0.tgz", + "integrity": "sha512-g9ljZiwki/LfxmQADO3dEY1CbpmXT5Hm2fJ+QaGKwSXUylMybePR7/67YW7jOrrvjEgL1Fmz5kzyAjWVWLlucg==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=6" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/webpack" + } + }, + "node_modules/tinyglobby": { + "version": "0.2.15", + "resolved": "https://registry.npmjs.org/tinyglobby/-/tinyglobby-0.2.15.tgz", + "integrity": "sha512-j2Zq4NyQYG5XMST4cbs02Ak8iJUdxRM0XI5QyxXuZOzKOINmWurp3smXu3y5wDcJrptwpSjgXHzIQxR0omXljQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "fdir": "^6.5.0", + "picomatch": "^4.0.3" + }, + "engines": { + "node": ">=12.0.0" + }, + "funding": { + "url": 
"https://github.com/sponsors/SuperchupuDev" + } + }, + "node_modules/trim-lines": { + "version": "3.0.1", + "resolved": "https://registry.npmjs.org/trim-lines/-/trim-lines-3.0.1.tgz", + "integrity": "sha512-kRj8B+YHZCc9kQYdWfJB2/oUl9rA99qbowYYBtr4ui4mZyAQ2JpvVBd/6U2YloATfqBhBTSMhTpgBHtU0Mf3Rg==", + "license": "MIT", + "funding": { + "type": "github", + "url": "https://github.com/sponsors/wooorm" + } + }, + "node_modules/trough": { + "version": "2.2.0", + "resolved": "https://registry.npmjs.org/trough/-/trough-2.2.0.tgz", + "integrity": "sha512-tmMpK00BjZiUyVyvrBK7knerNgmgvcV/KLVyuma/SC+TQN167GrMRciANTz09+k3zW8L8t60jWO1GpfkZdjTaw==", + "license": "MIT", + "funding": { + "type": "github", + "url": "https://github.com/sponsors/wooorm" + } + }, + "node_modules/type-check": { + "version": "0.4.0", + "resolved": "https://registry.npmjs.org/type-check/-/type-check-0.4.0.tgz", + "integrity": "sha512-XleUoc9uwGXqjWwXaUTZAmzMcFZ5858QA2vvx1Ur5xIcixXIP+8LnFDgRplU30us6teqdlskFfu+ae4K79Ooew==", + "dev": true, + "license": "MIT", + "dependencies": { + "prelude-ls": "^1.2.1" + }, + "engines": { + "node": ">= 0.8.0" + } + }, + "node_modules/typed-array-buffer": { + "version": "1.0.3", + "resolved": "https://registry.npmjs.org/typed-array-buffer/-/typed-array-buffer-1.0.3.tgz", + "integrity": "sha512-nAYYwfY3qnzX30IkA6AQZjVbtK6duGontcQm1WSG1MD94YLqK0515GNApXkoxKOWMusVssAHWLh9SeaoefYFGw==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bound": "^1.0.3", + "es-errors": "^1.3.0", + "is-typed-array": "^1.1.14" + }, + "engines": { + "node": ">= 0.4" + } + }, + "node_modules/typed-array-byte-length": { + "version": "1.0.3", + "resolved": "https://registry.npmjs.org/typed-array-byte-length/-/typed-array-byte-length-1.0.3.tgz", + "integrity": "sha512-BaXgOuIxz8n8pIq3e7Atg/7s+DpiYrxn4vdot3w9KbnBhcRQq6o3xemQdIfynqSeXeDrF32x+WvfzmOjPiY9lg==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bind": "^1.0.8", + "for-each": "^0.3.3", + "gopd": "^1.2.0", + 
"has-proto": "^1.2.0", + "is-typed-array": "^1.1.14" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/typed-array-byte-offset": { + "version": "1.0.4", + "resolved": "https://registry.npmjs.org/typed-array-byte-offset/-/typed-array-byte-offset-1.0.4.tgz", + "integrity": "sha512-bTlAFB/FBYMcuX81gbL4OcpH5PmlFHqlCCpAl8AlEzMz5k53oNDvN8p1PNOWLEmI2x4orp3raOFB51tv9X+MFQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "available-typed-arrays": "^1.0.7", + "call-bind": "^1.0.8", + "for-each": "^0.3.3", + "gopd": "^1.2.0", + "has-proto": "^1.2.0", + "is-typed-array": "^1.1.15", + "reflect.getprototypeof": "^1.0.9" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/typed-array-length": { + "version": "1.0.7", + "resolved": "https://registry.npmjs.org/typed-array-length/-/typed-array-length-1.0.7.tgz", + "integrity": "sha512-3KS2b+kL7fsuk/eJZ7EQdnEmQoaho/r6KUef7hxvltNA5DR8NAUM+8wJMbJyZ4G9/7i3v5zPBIMN5aybAh2/Jg==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bind": "^1.0.7", + "for-each": "^0.3.3", + "gopd": "^1.0.1", + "is-typed-array": "^1.1.13", + "possible-typed-array-names": "^1.0.0", + "reflect.getprototypeof": "^1.0.6" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/unbox-primitive": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/unbox-primitive/-/unbox-primitive-1.1.0.tgz", + "integrity": "sha512-nWJ91DjeOkej/TA8pXQ3myruKpKEYgqvpw9lz4OPHj/NWFNluYrjbz9j01CJ8yKQd2g4jFoOkINCTW2I5LEEyw==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bound": "^1.0.3", + "has-bigints": "^1.0.2", + "has-symbols": "^1.1.0", + "which-boxed-primitive": "^1.1.1" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + 
"node_modules/unified": { + "version": "11.0.5", + "resolved": "https://registry.npmjs.org/unified/-/unified-11.0.5.tgz", + "integrity": "sha512-xKvGhPWw3k84Qjh8bI3ZeJjqnyadK+GEFtazSfZv/rKeTkTjOJho6mFqh2SM96iIcZokxiOpg78GazTSg8+KHA==", + "license": "MIT", + "dependencies": { + "@types/unist": "^3.0.0", + "bail": "^2.0.0", + "devlop": "^1.0.0", + "extend": "^3.0.0", + "is-plain-obj": "^4.0.0", + "trough": "^2.0.0", + "vfile": "^6.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/unified" + } + }, + "node_modules/unist-util-is": { + "version": "6.0.1", + "resolved": "https://registry.npmjs.org/unist-util-is/-/unist-util-is-6.0.1.tgz", + "integrity": "sha512-LsiILbtBETkDz8I9p1dQ0uyRUWuaQzd/cuEeS1hoRSyW5E5XGmTzlwY1OrNzzakGowI9Dr/I8HVaw4hTtnxy8g==", + "license": "MIT", + "dependencies": { + "@types/unist": "^3.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/unified" + } + }, + "node_modules/unist-util-position": { + "version": "5.0.0", + "resolved": "https://registry.npmjs.org/unist-util-position/-/unist-util-position-5.0.0.tgz", + "integrity": "sha512-fucsC7HjXvkB5R3kTCO7kUjRdrS0BJt3M/FPxmHMBOm8JQi2BsHAHFsy27E0EolP8rp0NzXsJ+jNPyDWvOJZPA==", + "license": "MIT", + "dependencies": { + "@types/unist": "^3.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/unified" + } + }, + "node_modules/unist-util-stringify-position": { + "version": "4.0.0", + "resolved": "https://registry.npmjs.org/unist-util-stringify-position/-/unist-util-stringify-position-4.0.0.tgz", + "integrity": "sha512-0ASV06AAoKCDkS2+xw5RXJywruurpbC4JZSm7nr7MOt1ojAzvyyaO+UxZf18j8FCF6kmzCZKcAgN/yu2gm2XgQ==", + "license": "MIT", + "dependencies": { + "@types/unist": "^3.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/unified" + } + }, + "node_modules/unist-util-visit": { + "version": "5.0.0", + "resolved": 
"https://registry.npmjs.org/unist-util-visit/-/unist-util-visit-5.0.0.tgz", + "integrity": "sha512-MR04uvD+07cwl/yhVuVWAtw+3GOR/knlL55Nd/wAdblk27GCVt3lqpTivy/tkJcZoNPzTwS1Y+KMojlLDhoTzg==", + "license": "MIT", + "dependencies": { + "@types/unist": "^3.0.0", + "unist-util-is": "^6.0.0", + "unist-util-visit-parents": "^6.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/unified" + } + }, + "node_modules/unist-util-visit-parents": { + "version": "6.0.2", + "resolved": "https://registry.npmjs.org/unist-util-visit-parents/-/unist-util-visit-parents-6.0.2.tgz", + "integrity": "sha512-goh1s1TBrqSqukSc8wrjwWhL0hiJxgA8m4kFxGlQ+8FYQ3C/m11FcTs4YYem7V664AhHVvgoQLk890Ssdsr2IQ==", + "license": "MIT", + "dependencies": { + "@types/unist": "^3.0.0", + "unist-util-is": "^6.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/unified" + } + }, + "node_modules/update-browserslist-db": { + "version": "1.2.3", + "resolved": "https://registry.npmjs.org/update-browserslist-db/-/update-browserslist-db-1.2.3.tgz", + "integrity": "sha512-Js0m9cx+qOgDxo0eMiFGEueWztz+d4+M3rGlmKPT+T4IS/jP4ylw3Nwpu6cpTTP8R1MAC1kF4VbdLt3ARf209w==", + "dev": true, + "funding": [ + { + "type": "opencollective", + "url": "https://opencollective.com/browserslist" + }, + { + "type": "tidelift", + "url": "https://tidelift.com/funding/github/npm/browserslist" + }, + { + "type": "github", + "url": "https://github.com/sponsors/ai" + } + ], + "license": "MIT", + "dependencies": { + "escalade": "^3.2.0", + "picocolors": "^1.1.1" + }, + "bin": { + "update-browserslist-db": "cli.js" + }, + "peerDependencies": { + "browserslist": ">= 4.21.0" + } + }, + "node_modules/uri-js": { + "version": "4.4.1", + "resolved": "https://registry.npmjs.org/uri-js/-/uri-js-4.4.1.tgz", + "integrity": "sha512-7rKUyy33Q1yc98pQ1DAmLtwX109F7TIfWlW1Ydo8Wl1ii1SeHieeh0HHfPeL2fMXK6z0s8ecKs9frCuLJvndBg==", + "dev": true, + "license": "BSD-2-Clause", + "dependencies": { + 
"punycode": "^2.1.0" + } + }, + "node_modules/vfile": { + "version": "6.0.3", + "resolved": "https://registry.npmjs.org/vfile/-/vfile-6.0.3.tgz", + "integrity": "sha512-KzIbH/9tXat2u30jf+smMwFCsno4wHVdNmzFyL+T/L3UGqqk6JKfVqOFOZEpZSHADH1k40ab6NUIXZq422ov3Q==", + "license": "MIT", + "dependencies": { + "@types/unist": "^3.0.0", + "vfile-message": "^4.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/unified" + } + }, + "node_modules/vfile-message": { + "version": "4.0.3", + "resolved": "https://registry.npmjs.org/vfile-message/-/vfile-message-4.0.3.tgz", + "integrity": "sha512-QTHzsGd1EhbZs4AsQ20JX1rC3cOlt/IWJruk893DfLRr57lcnOeMaWG4K0JrRta4mIJZKth2Au3mM3u03/JWKw==", + "license": "MIT", + "dependencies": { + "@types/unist": "^3.0.0", + "unist-util-stringify-position": "^4.0.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/unified" + } + }, + "node_modules/vite": { + "version": "6.4.1", + "resolved": "https://registry.npmjs.org/vite/-/vite-6.4.1.tgz", + "integrity": "sha512-+Oxm7q9hDoLMyJOYfUYBuHQo+dkAloi33apOPP56pzj+vsdJDzr+j1NISE5pyaAuKL4A3UD34qd0lx5+kfKp2g==", + "dev": true, + "license": "MIT", + "dependencies": { + "esbuild": "^0.25.0", + "fdir": "^6.4.4", + "picomatch": "^4.0.2", + "postcss": "^8.5.3", + "rollup": "^4.34.9", + "tinyglobby": "^0.2.13" + }, + "bin": { + "vite": "bin/vite.js" + }, + "engines": { + "node": "^18.0.0 || ^20.0.0 || >=22.0.0" + }, + "funding": { + "url": "https://github.com/vitejs/vite?sponsor=1" + }, + "optionalDependencies": { + "fsevents": "~2.3.3" + }, + "peerDependencies": { + "@types/node": "^18.0.0 || ^20.0.0 || >=22.0.0", + "jiti": ">=1.21.0", + "less": "*", + "lightningcss": "^1.21.0", + "sass": "*", + "sass-embedded": "*", + "stylus": "*", + "sugarss": "*", + "terser": "^5.16.0", + "tsx": "^4.8.1", + "yaml": "^2.4.2" + }, + "peerDependenciesMeta": { + "@types/node": { + "optional": true + }, + "jiti": { + "optional": true + }, + "less": { + 
"optional": true + }, + "lightningcss": { + "optional": true + }, + "sass": { + "optional": true + }, + "sass-embedded": { + "optional": true + }, + "stylus": { + "optional": true + }, + "sugarss": { + "optional": true + }, + "terser": { + "optional": true + }, + "tsx": { + "optional": true + }, + "yaml": { + "optional": true + } + } + }, + "node_modules/which": { + "version": "2.0.2", + "resolved": "https://registry.npmjs.org/which/-/which-2.0.2.tgz", + "integrity": "sha512-BLI3Tl1TW3Pvl70l3yq3Y64i+awpwXqsGBYWkkqMtnbXgrMD+yj7rhW0kuEDxzJaYXGjEW5ogapKNMEKNMjibA==", + "dev": true, + "license": "ISC", + "dependencies": { + "isexe": "^2.0.0" + }, + "bin": { + "node-which": "bin/node-which" + }, + "engines": { + "node": ">= 8" + } + }, + "node_modules/which-boxed-primitive": { + "version": "1.1.1", + "resolved": "https://registry.npmjs.org/which-boxed-primitive/-/which-boxed-primitive-1.1.1.tgz", + "integrity": "sha512-TbX3mj8n0odCBFVlY8AxkqcHASw3L60jIuF8jFP78az3C2YhmGvqbHBpAjTRH2/xqYunrJ9g1jSyjCjpoWzIAA==", + "dev": true, + "license": "MIT", + "dependencies": { + "is-bigint": "^1.1.0", + "is-boolean-object": "^1.2.1", + "is-number-object": "^1.1.1", + "is-string": "^1.1.1", + "is-symbol": "^1.1.1" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/which-builtin-type": { + "version": "1.2.1", + "resolved": "https://registry.npmjs.org/which-builtin-type/-/which-builtin-type-1.2.1.tgz", + "integrity": "sha512-6iBczoX+kDQ7a3+YJBnh3T+KZRxM/iYNPXicqk66/Qfm1b93iu+yOImkg0zHbj5LNOcNv1TEADiZ0xa34B4q6Q==", + "dev": true, + "license": "MIT", + "dependencies": { + "call-bound": "^1.0.2", + "function.prototype.name": "^1.1.6", + "has-tostringtag": "^1.0.2", + "is-async-function": "^2.0.0", + "is-date-object": "^1.1.0", + "is-finalizationregistry": "^1.1.0", + "is-generator-function": "^1.0.10", + "is-regex": "^1.2.1", + "is-weakref": "^1.0.2", + "isarray": "^2.0.5", + "which-boxed-primitive": "^1.1.0", + 
"which-collection": "^1.0.2", + "which-typed-array": "^1.1.16" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/which-collection": { + "version": "1.0.2", + "resolved": "https://registry.npmjs.org/which-collection/-/which-collection-1.0.2.tgz", + "integrity": "sha512-K4jVyjnBdgvc86Y6BkaLZEN933SwYOuBFkdmBu9ZfkcAbdVbpITnDmjvZ/aQjRXQrv5EPkTnD1s39GiiqbngCw==", + "dev": true, + "license": "MIT", + "dependencies": { + "is-map": "^2.0.3", + "is-set": "^2.0.3", + "is-weakmap": "^2.0.2", + "is-weakset": "^2.0.3" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/which-typed-array": { + "version": "1.1.20", + "resolved": "https://registry.npmjs.org/which-typed-array/-/which-typed-array-1.1.20.tgz", + "integrity": "sha512-LYfpUkmqwl0h9A2HL09Mms427Q1RZWuOHsukfVcKRq9q95iQxdw0ix1JQrqbcDR9PH1QDwf5Qo8OZb5lksZ8Xg==", + "dev": true, + "license": "MIT", + "dependencies": { + "available-typed-arrays": "^1.0.7", + "call-bind": "^1.0.8", + "call-bound": "^1.0.4", + "for-each": "^0.3.5", + "get-proto": "^1.0.1", + "gopd": "^1.2.0", + "has-tostringtag": "^1.0.2" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/word-wrap": { + "version": "1.2.5", + "resolved": "https://registry.npmjs.org/word-wrap/-/word-wrap-1.2.5.tgz", + "integrity": "sha512-BN22B5eaMMI9UMtjrGd5g5eCYPpCPDUy0FJXbYsaT5zYxjFOckS53SQDE3pWkVoWpHXVb3BrYcEN4Twa55B5cA==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=0.10.0" + } + }, + "node_modules/yallist": { + "version": "3.1.1", + "resolved": "https://registry.npmjs.org/yallist/-/yallist-3.1.1.tgz", + "integrity": "sha512-a4UGQaWPH59mOXUYnAG2ewncQS4i4F43Tv3JoAM+s2VDAmS9NsK8GpDMLrCHPksFT7h3K6TOoUNn2pb7RoXx4g==", + "dev": true, + "license": "ISC" + }, + "node_modules/yocto-queue": { + "version": "0.1.0", + "resolved": 
"https://registry.npmjs.org/yocto-queue/-/yocto-queue-0.1.0.tgz", + "integrity": "sha512-rVksvsnNCdJ/ohGc6xgPwyN8eheCxsiLM8mxuE/t/mOVqJewPuO1miLpTHQiRgTKCLexL4MeAFVagts7HmNZ2Q==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=10" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, + "node_modules/zwitch": { + "version": "2.0.4", + "resolved": "https://registry.npmjs.org/zwitch/-/zwitch-2.0.4.tgz", + "integrity": "sha512-bXE4cR/kVZhKZX/RjPEflHaKVhUVl85noU3v6b8apfQEc1x4A+zBxjZ4lN8LqGd6WZ3dl98pY4o717VFmoPp+A==", + "license": "MIT", + "funding": { + "type": "github", + "url": "https://github.com/sponsors/wooorm" + } + } + } +} diff --git a/submissions/team_2/chatbot/frontend/package.json b/submissions/team_2/chatbot/frontend/package.json new file mode 100644 index 0000000..99cbc0e --- /dev/null +++ b/submissions/team_2/chatbot/frontend/package.json @@ -0,0 +1,30 @@ +{ + "name": "metakgp-chatbot-frontend", + "version": "1.0.0", + "type": "module", + "description": "MetaKGP Chatbot Frontend - React + Vite + TailwindCSS", + "scripts": { + "dev": "vite", + "build": "vite build", + "preview": "vite preview", + "lint": "eslint . 
--ext js,jsx --report-unused-disable-directives --max-warnings 0" + }, + "dependencies": { + "react": "^18.3.1", + "react-dom": "^18.3.1", + "react-markdown": "^10.1.0", + "tailwindcss": "^4.1.18" + }, + "devDependencies": { + "@tailwindcss/vite": "^4.1.18", + "@types/react": "^18.3.18", + "@types/react-dom": "^18.3.5", + "@vitejs/plugin-react": "^4.3.4", + "autoprefixer": "^10.4.20", + "eslint": "^9.18.0", + "eslint-plugin-react": "^7.37.2", + "eslint-plugin-react-hooks": "^5.1.0", + "eslint-plugin-react-refresh": "^0.4.16", + "vite": "^6.0.7" + } +} diff --git a/submissions/team_2/chatbot/frontend/postcss.config.js b/submissions/team_2/chatbot/frontend/postcss.config.js new file mode 100644 index 0000000..b6dc034 --- /dev/null +++ b/submissions/team_2/chatbot/frontend/postcss.config.js @@ -0,0 +1,5 @@ +export default { + plugins: { + autoprefixer: {}, + }, +} diff --git a/submissions/team_2/chatbot/frontend/src/App.jsx b/submissions/team_2/chatbot/frontend/src/App.jsx new file mode 100644 index 0000000..26553d0 --- /dev/null +++ b/submissions/team_2/chatbot/frontend/src/App.jsx @@ -0,0 +1,260 @@ +import { useState, useRef, useEffect } from 'react' +import ReactMarkdown from 'react-markdown' + +function App() { + const [messages, setMessages] = useState([]) + const [inputValue, setInputValue] = useState('') + const [isLoading, setIsLoading] = useState(false) + const messagesEndRef = useRef(null) + + const scrollToBottom = () => { + messagesEndRef.current?.scrollIntoView({ behavior: 'smooth' }) + } + + useEffect(() => { + scrollToBottom() + }, [messages]) + + const examplePrompts = [ + 'Tell me about Spring Fest', + 'Main Building History', + "Explain 'Hall Tempo'", + 'Life at Nehru Museum' + ] + + const handleSendMessage = async (messageText) => { + const query = messageText || inputValue + if (!query.trim()) return + + // Add user message to chat + const userMessage = { + type: 'user', + content: query + } + setMessages(prev => [...prev, userMessage]) + 
setInputValue('') + setIsLoading(true) + + try { + const response = await fetch('http://localhost:8000/got/query', { + method: 'POST', + headers: { + 'accept': 'application/json', + 'Content-Type': 'application/json' + }, + body: JSON.stringify({ query }) + }) + + if (!response.ok) { + throw new Error(`Server responded with status ${response.status}`) + } + + const data = await response.json() + + // Add bot response to chat + const botMessage = { + type: 'bot', + content: data.answer || data.error || 'No response received', + sources: data.sources || [], + chunksRetrieved: data.chunks_retrieved, + error: data.error + } + setMessages(prev => [...prev, botMessage]) + } catch (error) { + const errorMessage = { + type: 'bot', + content: `Error: ${error.message}`, + isError: true + } + setMessages(prev => [...prev, errorMessage]) + } finally { + setIsLoading(false) + } + } + + const handleKeyPress = (e) => { + if (e.key === 'Enter' && !e.shiftKey) { + e.preventDefault() + handleSendMessage() + } + } + + return ( +
+
+
+
+ + + + + +
+ GraphMind +
+
IIT KHARAGPUR
+
+ +
+ {messages.length === 0 ? ( +
+
+ + + + +
+

How can I help you today?

+

+ I'm GraphMind, your intelligent companion for everything IIT Kharagpur. Ask me about academics, culture, or campus history. +

+
+ {examplePrompts.map((prompt, index) => ( + + ))} +
+
+ ) : ( +
+
+ {messages.map((message, index) => ( +
+ {message.type === 'bot' && ( +
+ + + + +
+ )} +
+ {message.type === 'bot' && ( +
+ GraphMind +
+ )} + {message.type === 'user' && ( +
+ YOU +
+ )} +
+ {message.type === 'bot' ? ( +

{children}

, + ul: ({children}) =>
    {children}
, + ol: ({children}) =>
    {children}
, + li: ({children}) =>
  • {children}
  • , + code: ({inline, children}) => + inline ? {children} + : {children}, + strong: ({children}) => {children}, + a: ({href, children}) => {children}, + h1: ({children}) =>

    {children}

    , + h2: ({children}) =>

    {children}

    , + h3: ({children}) =>

    {children}

    , + }} + > + {message.content} +
    + ) : ( +
    {message.content}
    + )} +
    + {message.type === 'bot' && ( +
    + {message.sources && message.sources.length > 0 && ( +
    + Sources: + {message.sources.map((source, idx) => { + const wikiUrl = `https://wiki.metakgp.org/w/${source.replace(/ /g, '_')}`; + return ( + + + + + + {source} + {/* Tooltip */} +
    +
    {source}
    +
    Click to view on MetaKGP Wiki β†’
    + {/* Arrow */} +
    +
    +
    +
    +
    + ); + })} +
    + )} +
    + )} +
    +
    + ))} + {isLoading && ( +
    +
    + + + + +
    +
    +
    + + + +
    +
    +
    + )} +
    +
    +
    + )} +
    + +
    +
    + setInputValue(e.target.value)} + onKeyPress={handleKeyPress} + disabled={isLoading} + /> + +
    +
    + YOGAH KARMASU KAUSHALAM +
    +
    +
    + ) +} + +export default App diff --git a/submissions/team_2/chatbot/frontend/src/index.css b/submissions/team_2/chatbot/frontend/src/index.css new file mode 100644 index 0000000..a5d4f9b --- /dev/null +++ b/submissions/team_2/chatbot/frontend/src/index.css @@ -0,0 +1,13 @@ +@import "tailwindcss"; + +@theme { + --color-graphmind-blue: #3B82F6; + --color-graphmind-dark: #0a0a0a; + --color-graphmind-card: #1a1a2e; +} + +@layer base { + body { + @apply bg-graphmind-dark text-white; + } +} diff --git a/submissions/team_2/chatbot/frontend/src/main.jsx b/submissions/team_2/chatbot/frontend/src/main.jsx new file mode 100644 index 0000000..54b39dd --- /dev/null +++ b/submissions/team_2/chatbot/frontend/src/main.jsx @@ -0,0 +1,10 @@ +import React from 'react' +import ReactDOM from 'react-dom/client' +import App from './App.jsx' +import './index.css' + +ReactDOM.createRoot(document.getElementById('root')).render( + + + , +) diff --git a/submissions/team_2/chatbot/frontend/tailwind.config.js b/submissions/team_2/chatbot/frontend/tailwind.config.js new file mode 100644 index 0000000..bd8131e --- /dev/null +++ b/submissions/team_2/chatbot/frontend/tailwind.config.js @@ -0,0 +1,7 @@ +/** @type {import('tailwindcss').Config} */ +export default { + content: [ + "./index.html", + "./src/**/*.{js,ts,jsx,tsx}", + ], +} diff --git a/submissions/team_2/chatbot/frontend/vite.config.js b/submissions/team_2/chatbot/frontend/vite.config.js new file mode 100644 index 0000000..63e7c99 --- /dev/null +++ b/submissions/team_2/chatbot/frontend/vite.config.js @@ -0,0 +1,10 @@ +import { defineConfig } from 'vite' +import react from '@vitejs/plugin-react' +import tailwindcss from '@tailwindcss/vite' + +export default defineConfig({ + plugins: [react(), tailwindcss(),], + server: { + port: 3000 + } +}) diff --git a/submissions/team_2/scraper/.gitignore b/submissions/team_2/scraper/.gitignore new file mode 100644 index 0000000..cd5320d --- /dev/null +++ b/submissions/team_2/scraper/.gitignore @@ 
-0,0 +1,37 @@ +# Python +__pycache__/ +*.py[cod] +*$py.class +*.so +.Python +*.egg-info/ +dist/ +build/ + +# Virtual Environment +venv/ +env/ +ENV/ + +# Scraped data +scraped_data/ +*.txt +!requirements.txt + +# IDE +.vscode/ +.idea/ +*.swp +*.swo +*~ + +*.json + +# OS +.DS_Store +Thumbs.db + +results/ + +*.env* +!.env.example \ No newline at end of file diff --git a/submissions/team_2/scraper/README.md b/submissions/team_2/scraper/README.md new file mode 100644 index 0000000..e4e8d92 --- /dev/null +++ b/submissions/team_2/scraper/README.md @@ -0,0 +1,377 @@ +# MetaKGP Wiki Scraper + +A robust, concurrent wiki scraper for MetaKGP (https://wiki.metakgp.org) with batch processing, multi-threading, Wikitext cleaning, and organized output. + +## Features + +- **Concurrent scraping** with 4-20 threads for parallel processing +- **Wikitext cleaning** - automatic conversion to clean Markdown format +- **Infobox extraction** - converts infoboxes to readable summaries +- **Link cleaning** - converts wiki links to plain text +- **Database upload** - PostgreSQL integration with --database flag +- **Batch file storage** - save pages in separate files for better organization +- **Flexible control** - scrape specific ranges, limits, or the entire wiki +- **Multiple formats** - JSON (structured), text (raw), and Markdown (cleaned) +- **Complete page list** - helper script fetches and stores every page title up front +- **Easy to use** - simple Python commands +- **Clean structure** - organized src/ and results/ folders + +## Project Structure + +``` +scraper/ +β”œβ”€β”€ src/ # Source code +β”‚ β”œβ”€β”€ main.py # Main parallel scraper +β”‚ β”œβ”€β”€ fetch_all_links.py # Fetch all page links +β”‚ └── wikitext_cleaner.py # Clean wikitext to Markdown +β”œβ”€β”€ results/ # All data outputs +β”‚ β”œβ”€β”€ all_pages.json # List of all pages +β”‚ β”œβ”€β”€ all_pages.txt # Text version +β”‚ └── scraped_data/ # Scraped page content +β”‚ β”œβ”€β”€ scraped_pages.json # JSON with both raw & cleaned text +β”‚ └── ... 
+β”œβ”€β”€ venv/ # Python virtual environment +└── README.md # This file +``` + +## Quick Start + +### 1. Activate Environment + +```bash +# Activate virtual environment (from project root) +source venv/bin/activate +``` + +### 2. Get All Pages +```bash +# Fetch All Pages +python src/fetch_all_links.py +``` + +### 3. Scrape Pages + +```bash +# Quick sample (10 pages - creates JSON with both raw and cleaned text) +python src/main.py results/all_pages.json --limit 10 + +# Medium batch (100 pages in batches of 25) +python src/main.py results/all_pages.json --limit 100 --pages 25 --threads 8 + +# Full wiki (all 3,583 pages in batches of 50) +python src/main.py results/all_pages.json --pages 50 --threads 4 + +# With database upload (requires .env file at submissions/team_2/.env) +python src/main.py results/all_pages.json --limit 10 --database + +# Optional: Also export raw wikitext to separate text file +python src/main.py results/all_pages.json --limit 10 --text +``` + +**Note:** The `--database` flag requires database credentials in the unified `.env` file at `submissions/team_2/.env` (see main README for setup). + +### 4. Check Results + +```bash +# View JSON structure with both raw and cleaned text +cat results/scraped_data/scraped_pages.json | jq '.pages[0]' + +# View just the cleaned text from JSON +python -c "import json; print(json.load(open('results/scraped_data/scraped_pages.json'))['pages'][0]['cleaned_text'])" +``` + +## Wikitext Cleaning Features + +The scraper automatically cleans Wikitext into human-readable Markdown format: + +### Features + +1. **Infobox Extraction** + - Converts `{{Infobox ...}}` templates to readable summary paragraphs + - Places summary at the start of the document + - Example: `**Summary:** name: IIT Kharagpur; established: 1951; type: Public` + +2. **Header Conversion** + - `==Introduction==` β†’ `## Introduction` + - `===Subsection===` β†’ `### Subsection` + - Maintains proper Markdown header hierarchy + +3. 
**Link Cleaning** + - `[[Target|Display]]` β†’ `Display` + - `[[Target]]` β†’ `Target` + - Removes wiki markup while preserving text + +4. **Template Removal** + - Removes citation templates + - Cleans up navigation boxes + - Preserves important content + +5. **Additional Cleanup** + - Removes HTML comments + - Cleans up references + - Normalizes whitespace + +### Example Transformation + +**Before (Wikitext):** +```wikitext +{{Infobox university +| name = IIT Kharagpur +| established = 1951 +}} + +==Introduction== +The [[Indian Institute of Technology Kharagpur|IIT Kharagpur]] is located in [[Kharagpur]]. + +===History=== +It was established in 1951. +``` + +**After (Cleaned Markdown):** +```markdown +**Summary:** name: IIT Kharagpur; established: 1951. + +## Introduction +The IIT Kharagpur is located in Kharagpur. + +### History +It was established in 1951. +``` + +## Usage Examples + +### Basic Usage + +```bash +# Scrape 10 pages (creates JSON with both raw and cleaned text) +python src/main.py results/all_pages.json --limit 10 + +# Also export raw wikitext to separate .txt file +python src/main.py results/all_pages.json --limit 10 --text +``` + +### Batch Processing + +```bash +# Scrape 100 pages in batches of 20 (creates 5 files) +python src/main.py results/all_pages.json --limit 100 --pages 20 + +# Each batch file contains max 20 pages: +# - results/scraped_data/scraped_pages_batch1.json +# - results/scraped_data/scraped_pages_batch2.json +# - results/scraped_data/scraped_pages_batch3.json +# - results/scraped_data/scraped_pages_batch4.json +# - results/scraped_data/scraped_pages_batch5.json +``` + +### Advanced Options + +```bash +# Increase threads for faster scraping +python src/main.py results/all_pages.json --limit 100 --threads 8 + +# Skip first 50 pages, scrape next 50 +python src/main.py results/all_pages.json --start 50 --limit 50 + +# Scrape ALL pages in batches of 100 with text output +python src/main.py results/all_pages.json --pages 100 --threads 
8 --text +``` + +## Command-Line Options + +### main.py + +``` +Required: + input_file JSON file with list of pages (use results/all_pages.json) + +Optional: + --limit N Total number of pages to scrape (default: all) + --pages N Batch size - max pages per output file (default: all in one file) + --threads N Number of concurrent threads (default: 4, max: 20) + --start N Starting index in page list (default: 0) + --text Export raw wikitext to separate .txt file + --database Upload scraped pages to PostgreSQL database +``` + +**Note:** The scraper automatically cleans all pages. By default, it creates only a JSON file containing both `text` (raw wikitext) and `cleaned_text` (cleaned markdown) fields. Use `--text` flag to also export a separate raw wikitext file. Use `--database` flag to upload to PostgreSQL (requires `.env` file in team_2 folder). + +### fetch_all_links.py + +``` +Optional: + --output FILE Output filename (default: all_pages) + --no-text Don't save text version +``` + +## Performance + +- **Speed:** ~0.36-0.40 seconds per page with 4 threads +- **Full wiki:** ~20-30 minutes for all 3,583 pages +- **Memory:** Minimal - each batch saved immediately + +## Common Workflows + +### 1. First Time Setup + +```bash +source venv/bin/activate +python src/fetch_all_links.py # Fetches all page titles +``` + +### 2. Sample Testing + +```bash +# Test with 10 pages first +python src/main.py results/all_pages.json --limit 10 + +# Check output +cat results/scraped_data/scraped_pages.json | jq '.pages | length' +``` + +### 3. Production Scraping + +```bash +# Scrape entire wiki in manageable batches +python src/main.py results/all_pages.json --pages 50 --threads 8 --text + +# Both JSON and text formats for easy viewing +``` + +### 4. 
Specific Range + +```bash +# Scrape pages 1000-1100 in batches of 25 +python src/main.py results/all_pages.json --start 1000 --limit 100 --pages 25 +``` + +## All Available Scripts + +### Concurrent Scraper (Main Tool) + +```bash +python src/main.py results/all_pages.json [OPTIONS] +``` + +### Fetch All Page Links + +```bash +python src/fetch_all_links.py [OPTIONS] +``` + +## Output Format + +### JSON Structure + +Each scraped page now includes both raw wikitext and cleaned Markdown: + +```json +{ + "total_scraped": 10, + "scraped_at": "2026-01-13 10:30:00", + "pages": [ + { + "name": "Page Title", + "title": "Page Title", + "text": "{{Infobox...}}\n==Introduction==\nRaw wikitext...", + "cleaned_text": "**Summary:** ...\n\n## Introduction\nCleaned markdown...", + "exists": true, + "redirect": false, + "revision": 12345, + "categories": ["Category1", "Category2"], + "links": ["Link1", "Link2"] + } + ] +} +``` + +### Cleaned Markdown File Format + +```markdown +================================================================================ +# Page Title +================================================================================ + +**Summary:** key1: value1; key2: value2. + +## Introduction +The page content in clean Markdown format... + +### Subsection +More content with proper links and formatting. +``` + +### Text Format (Raw Wikitext) + +``` +================================================================================ +Page: Page Title +================================================================================ + +{{Infobox university +| name = Page Title +}} + +==Introduction== +The '''Page''' with [[links]] and templates. 
+``` + +## Dependencies + +- Python 3.13+ +- mwclient 0.11.0 - MediaWiki API client +- mwparserfromhell 0.7.2 - Wikitext parser +- beautifulsoup4 4.14.3 - HTML parsing +- requests 2.32.5 - HTTP library +- psycopg2-binary 2.9.11 - PostgreSQL adapter +- python-dotenv 1.2.1 - Environment variables + +All dependencies are installed via: + +```bash +source venv/bin/activate +pip install -r requirements.txt +``` + +## Additional Documentation + +- `DATABASE_UPLOAD.md` - Database integration guide +- `WIKITEXT_CLEANING.md` - Wikitext cleaning documentation +- `QUICK_START.md` - Quick reference guide +- `SCRAPER_README.md` - Detailed technical documentation +- `UPDATE_NOTES.md` - Flag system changes +- `BATCH_FILES_FEATURE.md` - Batch processing details + +## Current Status + +- All 3,583 page links fetched and saved +- Concurrent scraper tested and working +- Batch processing tested with 5, 10, 25 pages per batch +- Performance validated: ~0.36-0.40 seconds/page +- Full project structure organized + +## Next Steps + +1. **Activate environment:** + ```bash + source venv/bin/activate + ``` + +2. **Run a quick test:** + ```bash + python src/main.py results/all_pages.json --limit 10 + ``` + +3. **Check the output:** + ```bash + cat results/scraped_data/scraped_pages.json + ``` + +4. **Run full scrape:** + ```bash + python src/main.py results/all_pages.json --pages 50 --threads 8 + ``` + +## License + +This is a scraper tool for MetaKGP Wiki. Please respect the wiki's terms of service and rate limits. 
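Beyond the `jq` one-liner shown earlier, batch files can be sanity-checked programmatically before uploading. The following is a minimal sketch assuming the JSON layout documented under Output Format; `sample_batch` and `summarize_batch` are hypothetical names, with `sample_batch` standing in for an actual loaded file:

```python
# Hypothetical stand-in for a loaded batch file, e.g.
# json.load(open("results/scraped_data/scraped_pages.json"));
# the keys mirror the "JSON Structure" section of this README.
sample_batch = {
    "total_scraped": 2,
    "scraped_at": "2026-01-13 10:30:00",
    "pages": [
        {"name": "Page A", "text": "==Intro==\nRaw wikitext", "cleaned_text": "## Intro\nClean"},
        {"name": "Page B", "text": "Raw only", "cleaned_text": "Clean only"},
    ],
}

def summarize_batch(batch):
    """Return (page_count, total_cleaned_chars) for one scraped batch."""
    pages = batch.get("pages", [])
    return len(pages), sum(len(p.get("cleaned_text", "")) for p in pages)

count, chars = summarize_batch(sample_batch)
print(f"{count} pages, {chars} cleaned characters")  # prints: 2 pages, 24 cleaned characters
```

Pointing the same check at each `scraped_pages_batchN.json` confirms that no batch came back empty (or with empty `cleaned_text` fields) before running an upload with `--database`.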
diff --git a/submissions/team_2/scraper/requirements.txt b/submissions/team_2/scraper/requirements.txt new file mode 100644 index 0000000..47fba47 --- /dev/null +++ b/submissions/team_2/scraper/requirements.txt @@ -0,0 +1,6 @@ +mwclient>=0.10.1 +beautifulsoup4>=4.9.0 +requests>=2.25.0 +mwparserfromhell>=0.6.4 +psycopg2-binary>=2.9.0 +python-dotenv>=1.0.0 diff --git a/submissions/team_2/scraper/src/database_uploader.py b/submissions/team_2/scraper/src/database_uploader.py new file mode 100644 index 0000000..7ace7a0 --- /dev/null +++ b/submissions/team_2/scraper/src/database_uploader.py @@ -0,0 +1,270 @@ +#!/usr/bin/env python3 +""" +Database Uploader Module +Handles uploading scraped wiki pages to PostgreSQL database +""" + +import psycopg2 +from psycopg2.extras import execute_batch +from typing import List, Dict, Optional +import os +from pathlib import Path +from dotenv import load_dotenv + + +class DatabaseUploader: + """Handles database operations for wiki pages""" + + def __init__(self, env_path: Optional[str] = None): + """ + Initialize database connection + + Args: + env_path: Path to .env file (default: looks in team_2 root) + """ + # Load environment variables + if env_path: + load_dotenv(env_path) + else: + # Try to find .env in team_2 root directory + current_dir = Path(__file__).resolve().parent + team_2_root = current_dir.parent.parent + env_file = team_2_root / '.env' + if env_file.exists(): + load_dotenv(env_file) + else: + raise FileNotFoundError( + f"No .env file found at {env_file}. " + "Please create one with database credentials." 
+ ) + + # Get database credentials from environment + self.db_config = { + 'host': os.getenv('DB_HOST', 'localhost'), + 'port': os.getenv('DB_PORT', '5432'), + 'database': os.getenv('DB_NAME'), + 'user': os.getenv('DB_USER'), + 'password': os.getenv('DB_PASSWORD') + } + + # Validate required fields + if not all([self.db_config['database'], self.db_config['user'], self.db_config['password']]): + raise ValueError( + "Missing required database credentials in .env file. " + "Required: DB_NAME, DB_USER, DB_PASSWORD" + ) + + self.conn = None + self.cursor = None + + def connect(self): + """Establish database connection""" + try: + self.conn = psycopg2.connect(**self.db_config) + self.cursor = self.conn.cursor() + print(f"βœ“ Connected to database: {self.db_config['database']}") + except psycopg2.Error as e: + print(f"βœ— Database connection failed: {e}") + raise + + def disconnect(self): + """Close database connection""" + if self.cursor: + self.cursor.close() + if self.conn: + self.conn.close() + print("βœ“ Database connection closed") + + def upsert_page(self, page_data: Dict) -> bool: + """ + Insert or update a single page + + Args: + page_data: Dictionary containing page data + + Returns: + True if successful, False otherwise + """ + upsert_sql = """ + INSERT INTO metakgp_pages ( + name, title, text, cleaned_text, exists, redirect, + revision, categories, links + ) VALUES ( + %s, %s, %s, %s, %s, %s, %s, %s, %s + ) + ON CONFLICT (name) + DO UPDATE SET + title = EXCLUDED.title, + text = EXCLUDED.text, + cleaned_text = EXCLUDED.cleaned_text, + exists = EXCLUDED.exists, + redirect = EXCLUDED.redirect, + revision = EXCLUDED.revision, + categories = EXCLUDED.categories, + links = EXCLUDED.links, + updated_at = CURRENT_TIMESTAMP + """ + + try: + self.cursor.execute(upsert_sql, ( + page_data['name'], + page_data['title'], + page_data['text'], + page_data.get('cleaned_text', ''), + page_data.get('exists', True), + page_data.get('redirect', False), + page_data.get('revision'), 
+ page_data.get('categories', []), + page_data.get('links', []) + )) + return True + except psycopg2.Error as e: + print(f"βœ— Error upserting page '{page_data.get('name', 'unknown')}': {e}") + return False + + def upsert_pages_batch(self, pages: List[Dict], batch_size: int = 500) -> tuple: + """ + Insert or update multiple pages in batches + + Args: + pages: List of page data dictionaries + batch_size: Number of records to insert per batch (default: 500) + + Returns: + Tuple of (successful_count, failed_count) + """ + upsert_sql = """ + INSERT INTO metakgp_pages ( + name, title, text, cleaned_text, exists, redirect, + revision, categories, links + ) VALUES ( + %s, %s, %s, %s, %s, %s, %s, %s, %s + ) + ON CONFLICT (name) + DO UPDATE SET + title = EXCLUDED.title, + text = EXCLUDED.text, + cleaned_text = EXCLUDED.cleaned_text, + exists = EXCLUDED.exists, + redirect = EXCLUDED.redirect, + revision = EXCLUDED.revision, + categories = EXCLUDED.categories, + links = EXCLUDED.links, + updated_at = CURRENT_TIMESTAMP + """ + + successful = 0 + failed = 0 + total = len(pages) + + print(f"\nUploading {total} pages to database...") + + # Process in batches + for i in range(0, total, batch_size): + batch = pages[i:i + batch_size] + batch_data = [] + + for page in batch: + batch_data.append(( + page['name'], + page['title'], + page['text'], + page.get('cleaned_text', ''), + page.get('exists', True), + page.get('redirect', False), + page.get('revision'), + page.get('categories', []), + page.get('links', []) + )) + + try: + execute_batch(self.cursor, upsert_sql, batch_data) + self.conn.commit() + successful += len(batch) + print(f" [{successful}/{total}] Uploaded batch {i//batch_size + 1}") + except psycopg2.Error as e: + self.conn.rollback() + print(f" βœ— Batch {i//batch_size + 1} failed: {e}") + failed += len(batch) + + return successful, failed + + def upload_pages(self, pages: List[Dict]) -> tuple: + """ + Main method to upload pages to database + + Args: + pages: List of page 
data dictionaries + + Returns: + Tuple of (successful_count, failed_count) + """ + if not pages: + print("No pages to upload") + return 0, 0 + + print(f"\n{'='*70}") + print(f"Starting database upload") + print(f"{'='*70}") + + try: + self.connect() + + successful, failed = self.upsert_pages_batch(pages) + + print(f"\n{'='*70}") + print(f"Database Upload Complete!") + print(f"{'='*70}") + print(f"Successfully uploaded: {successful}") + print(f"Failed: {failed}") + print(f"{'='*70}\n") + + return successful, failed + + except Exception as e: + print(f"\nβœ— Database upload error: {e}") + return 0, len(pages) + finally: + self.disconnect() + + def get_page_count(self) -> int: + """Get total number of pages in database""" + try: + self.cursor.execute("SELECT COUNT(*) FROM metakgp_pages") + count = self.cursor.fetchone()[0] + return count + except psycopg2.Error as e: + print(f"βœ— Error getting page count: {e}") + return 0 + + def get_latest_revision(self, page_name: str) -> Optional[int]: + """Get the latest revision number for a page""" + try: + self.cursor.execute( + "SELECT revision FROM metakgp_pages WHERE name = %s", + (page_name,) + ) + result = self.cursor.fetchone() + return result[0] if result else None + except psycopg2.Error as e: + print(f"βœ— Error getting revision for '{page_name}': {e}") + return None + + +def test_connection(env_path: Optional[str] = None): + """Test database connection""" + try: + uploader = DatabaseUploader(env_path) + uploader.connect() + print("βœ“ Database connection test successful!") + uploader.disconnect() + return True + except Exception as e: + print(f"βœ— Database connection test failed: {e}") + return False + + +if __name__ == "__main__": + # Test the database connection + print("Testing database connection...") + test_connection() diff --git a/submissions/team_2/scraper/src/fetch_all_links.py b/submissions/team_2/scraper/src/fetch_all_links.py new file mode 100644 index 0000000..9a12dff --- /dev/null +++ 
b/submissions/team_2/scraper/src/fetch_all_links.py @@ -0,0 +1,182 @@ +#!/usr/bin/env python3 +""" +Fetch all page links from MetaKGP Wiki's Special:AllPages +Saves the complete list of page titles to a JSON file +Default output location: ./results/ +""" + +import requests +from bs4 import BeautifulSoup +import json +import time +import re +from pathlib import Path +from urllib.parse import urljoin, urlparse, parse_qs +from typing import List, Optional + + +class AllPagesFetcher: + """Fetches all page links from Special:AllPages""" + + def __init__(self, base_url: str = "https://wiki.metakgp.org"): + self.base_url = base_url + self.all_pages_url = f"{base_url}/w/Special:AllPages" + self.session = requests.Session() + self.session.headers.update({ + 'User-Agent': 'MetaKGP Wiki Scraper Bot/1.0' + }) + + def extract_page_links(self, html: str) -> List[str]: + """Extract page titles from the AllPages HTML""" + soup = BeautifulSoup(html, 'html.parser') + page_titles = [] + + # Find all page links in the mw-allpages-chunk list + chunk_list = soup.find('ul', class_='mw-allpages-chunk') + if chunk_list: + for link in chunk_list.find_all('a'): + title = link.get('title') + if title: + page_titles.append(title) + + return page_titles + + def extract_next_page_link(self, html: str) -> Optional[str]: + """Extract the 'Next page' link from the AllPages navigation""" + soup = BeautifulSoup(html, 'html.parser') + + # Find navigation div + nav_div = soup.find('div', class_='mw-allpages-nav') + if nav_div: + next_link = nav_div.find('a', string=re.compile(r'Next page')) + if next_link: + href = next_link.get('href') + if href: + return urljoin(self.base_url, href) + + return None + + def fetch_all_pages(self) -> List[str]: + """ + Fetch all page titles by following 'Next page' links + Returns a list of all page titles + """ + all_titles = [] + current_url = self.all_pages_url + page_count = 0 + + print("Starting to fetch all page links from Special:AllPages...") + print(f"Initial 
URL: {current_url}\n") + + while current_url: + page_count += 1 + print(f"Fetching page {page_count}... ", end='', flush=True) + + try: + response = self.session.get(current_url, timeout=30) + response.raise_for_status() + + # Extract page titles from current page + titles = self.extract_page_links(response.text) + all_titles.extend(titles) + print(f"Found {len(titles)} pages (Total: {len(all_titles)})") + + # Find next page link + next_url = self.extract_next_page_link(response.text) + + if next_url: + # Extract the 'from' parameter to show progress + parsed = urlparse(next_url) + params = parse_qs(parsed.query) + from_param = params.get('from', [''])[0] + if from_param: + print(f" β†’ Next starting from: {from_param}") + current_url = next_url + time.sleep(0.5) # Be nice to the server + else: + print("\nβœ“ Reached the end of all pages!") + current_url = None + + except requests.RequestException as e: + print(f"\nβœ— Error fetching {current_url}: {e}") + break + + print(f"\n{'='*70}") + print(f"Total pages fetched: {len(all_titles)}") + print(f"Total AllPages pagination pages visited: {page_count}") + print(f"{'='*70}") + + return all_titles + + def save_to_json(self, titles: List[str], filename: str = "all_pages.json", output_dir: str = "./results"): + """Save page titles to JSON file + + Args: + titles: List of page titles + filename: Output filename + output_dir: Output directory path (default: ./results) + """ + # Create output directory if it doesn't exist + output_path = Path(output_dir) + output_path.mkdir(parents=True, exist_ok=True) + full_path = output_path / filename + + data = { + "total_pages": len(titles), + "fetched_at": time.strftime("%Y-%m-%d %H:%M:%S"), + "pages": titles + } + + with open(full_path, 'w', encoding='utf-8') as f: + json.dump(data, f, indent=2, ensure_ascii=False) + + print(f"\nβœ“ Saved {len(titles)} page titles to {full_path}") + + def save_to_text(self, titles: List[str], filename: str = "all_pages.txt", output_dir: str = 
"./results"): + """Save page titles to text file (one per line) + + Args: + titles: List of page titles + filename: Output filename + output_dir: Output directory path (default: ./results) + """ + # Create output directory if it doesn't exist + output_path = Path(output_dir) + output_path.mkdir(parents=True, exist_ok=True) + full_path = output_path / filename + + with open(full_path, 'w', encoding='utf-8') as f: + for title in titles: + f.write(f"{title}\n") + + print(f"βœ“ Saved {len(titles)} page titles to {full_path}") + + +def main(): + """Main function - saves to ./results directory by default""" + fetcher = AllPagesFetcher() + + # Fetch all page links + all_titles = fetcher.fetch_all_pages() + + if all_titles: + # Save to ./results directory + fetcher.save_to_json(all_titles) + fetcher.save_to_text(all_titles) + + # Show some sample titles + print("\n" + "="*70) + print("Sample page titles (first 10):") + print("="*70) + for i, title in enumerate(all_titles[:10], 1): + print(f"{i}. {title}") + + if len(all_titles) > 10: + print("\n...") + print("\nLast 5 page titles:") + for i, title in enumerate(all_titles[-5:], len(all_titles) - 4): + print(f"{i}. 
{title}") + + +if __name__ == "__main__": + main() diff --git a/submissions/team_2/scraper/src/main.py b/submissions/team_2/scraper/src/main.py new file mode 100644 index 0000000..6ae551f --- /dev/null +++ b/submissions/team_2/scraper/src/main.py @@ -0,0 +1,472 @@ +#!/usr/bin/env python3 +""" +Concurrent Wiki Page Scraper +Reads page links from JSON file and fetches multiple pages in parallel using 4 threads +""" + +import json +import argparse +import mwclient +import time +from pathlib import Path +from typing import List, Dict, Optional +from concurrent.futures import ThreadPoolExecutor, as_completed +from threading import Lock +import sys +from wikitext_cleaner import WikitextCleaner +from database_uploader import DatabaseUploader + + +class ConcurrentWikiScraper: + """Scrapes multiple wiki pages concurrently using thread pool""" + + def __init__(self, max_workers: int = 4, output_dir: str = "results/scraped_data"): + """ + Initialize concurrent scraper + + Args: + max_workers: Number of parallel threads (default: 4) + output_dir: Directory to save scraped data (relative to current working directory) + """ + self.max_workers = max_workers + self.output_dir = Path(output_dir) + self.output_dir.mkdir(parents=True, exist_ok=True) + + # Thread-safe counter and lock for progress tracking + self.completed = 0 + self.failed = 0 + self.lock = Lock() + + # Initialize wiki connection (thread-safe) + self.site = mwclient.Site('wiki.metakgp.org', path='/') + + # Initialize wikitext cleaner + self.cleaner = WikitextCleaner() + + def fetch_page(self, page_name: str, index: int, total: int) -> Optional[Dict]: + """ + Fetch a single page (thread-safe) + + Args: + page_name: Name of the wiki page + index: Current page index + total: Total pages to fetch + + Returns: + Dictionary containing page data or None if failed + """ + try: + page = self.site.pages[page_name] + + if not page.exists: + with self.lock: + self.failed += 1 + print(f"[{index}/{total}] βœ— Page '{page_name}' 
does not exist") + return None + + # Fetch raw wikitext + raw_text = page.text() + + # Clean the wikitext to readable Markdown + cleaned_text = self.cleaner.clean_wikitext(raw_text) + + page_data = { + 'name': page.name, + 'title': page.page_title, + 'text': raw_text, # Keep original for reference + 'cleaned_text': cleaned_text, # Add cleaned version + 'exists': page.exists, + 'redirect': page.redirect, + 'revision': page.revision, + 'categories': [cat.name for cat in page.categories()], + 'links': [link.name for link in page.links()], + } + + with self.lock: + self.completed += 1 + print(f"[{index}/{total}] βœ“ Scraped: {page_name} (Thread {id(page) % 10000})") + + return page_data + + except Exception as e: + with self.lock: + self.failed += 1 + print(f"[{index}/{total}] βœ— Error fetching '{page_name}': {e}") + return None + + def scrape_pages_concurrent(self, page_names: List[str]) -> List[Dict]: + """ + Scrape multiple pages concurrently using thread pool + + Args: + page_names: List of page names to scrape + + Returns: + List of successfully scraped page data + """ + total = len(page_names) + results = [] + + print(f"\n{'='*70}") + print(f"Starting concurrent scrape of {total} pages using {self.max_workers} threads") + print(f"{'='*70}\n") + + start_time = time.time() + + # Create thread pool and submit all tasks + with ThreadPoolExecutor(max_workers=self.max_workers) as executor: + # Submit all fetch tasks + future_to_page = { + executor.submit(self.fetch_page, page_name, idx + 1, total): page_name + for idx, page_name in enumerate(page_names) + } + + # Collect results as they complete + for future in as_completed(future_to_page): + page_name = future_to_page[future] + try: + page_data = future.result() + if page_data: + results.append(page_data) + except Exception as e: + with self.lock: + self.failed += 1 + print(f"βœ— Exception for '{page_name}': {e}") + + elapsed = time.time() - start_time + + print(f"\n{'='*70}") + print(f"Scraping Complete!") + 
print(f"{'='*70}") + print(f"Total pages requested: {total}") + print(f"Successfully scraped: {self.completed}") + print(f"Failed: {self.failed}") + print(f"Time taken: {elapsed:.2f} seconds") + print(f"Average: {elapsed/total:.2f} seconds per page") + print(f"{'='*70}\n") + + return results + + def save_to_json(self, data: List[Dict], filename: str): + """Save scraped data to JSON file""" + filepath = self.output_dir / filename + + output = { + 'total_scraped': len(data), + 'scraped_at': time.strftime("%Y-%m-%d %H:%M:%S"), + 'pages': data + } + + with open(filepath, 'w', encoding='utf-8') as f: + json.dump(output, f, indent=2, ensure_ascii=False) + + print(f"βœ“ Saved {len(data)} pages to {filepath}") + + def save_to_text(self, data: List[Dict], filename: str): + """Save scraped data to text file""" + filepath = self.output_dir / filename + + with open(filepath, 'w', encoding='utf-8') as f: + for page in data: + f.write(f"{'='*80}\n") + f.write(f"Page: {page['name']}\n") + f.write(f"{'='*80}\n\n") + f.write(page['text']) + f.write(f"\n\n") + + print(f"βœ“ Saved {len(data)} pages to {filepath}") + + def save_cleaned_to_text(self, data: List[Dict], filename: str): + """Save cleaned text to markdown file""" + filepath = self.output_dir / filename + + with open(filepath, 'w', encoding='utf-8') as f: + for page in data: + f.write(f"{'='*80}\n") + f.write(f"# {page['name']}\n") + f.write(f"{'='*80}\n\n") + if 'cleaned_text' in page: + f.write(page['cleaned_text']) + else: + f.write(page['text']) + f.write(f"\n\n") + + print(f"βœ“ Saved {len(data)} cleaned pages to {filepath}") + + +def load_page_list(json_file: str) -> List[str]: + """Load page list from JSON file""" + try: + with open(json_file, 'r', encoding='utf-8') as f: + data = json.load(f) + + # Support different JSON structures + if isinstance(data, dict): + pages = data.get('pages', []) + elif isinstance(data, list): + pages = data + else: + raise ValueError("Invalid JSON format") + + return pages + + except 
FileNotFoundError: + print(f"βœ— Error: File '{json_file}' not found") + sys.exit(1) + except json.JSONDecodeError as e: + print(f"βœ— Error: Invalid JSON in '{json_file}': {e}") + sys.exit(1) + except Exception as e: + print(f"βœ— Error loading '{json_file}': {e}") + sys.exit(1) + + +def main(): + """Main function with argument parsing""" + parser = argparse.ArgumentParser( + description="Concurrent Wiki Page Scraper - Fetch multiple pages in parallel", + formatter_class=argparse.RawDescriptionHelpFormatter, + epilog=""" +Examples: + # Scrape first 10 pages (total) using 4 threads (default) + python src/main.py results/all_pages.json --limit 10 + + # Scrape 100 pages (total) in batches of 20, using 8 threads per batch + python src/main.py results/all_pages.json --limit 100 --pages 20 --threads 8 + + # Scrape all pages from the JSON file with default settings + python src/main.py results/all_pages.json + + # Scrape all 3583 pages in batches of 50, using 4 threads per batch + python src/main.py results/all_pages.json --pages 50 --threads 4 + + # Save output with custom filename and also as text + python src/main.py results/all_pages.json --limit 20 --output my_pages.json --text + + # Scrape 200 pages starting from index 100 + python src/main.py results/all_pages.json --limit 200 --start 100 + """ + ) + + parser.add_argument( + 'json_file', + help='JSON file containing list of page titles' + ) + + parser.add_argument( + '--pages', + type=int, + default=None, + help='Batch size for parallel processing (default: process all at once)' + ) + + parser.add_argument( + '--limit', + type=int, + default=None, + help='Total number of pages to scrape (default: all pages in file)' + ) + + parser.add_argument( + '--threads', + type=int, + default=4, + help='Number of parallel threads (default: 4)' + ) + + parser.add_argument( + '--output', + type=str, + default='scraped_pages.json', + help='Output JSON filename (default: scraped_pages.json)' + ) + + parser.add_argument( + 
'--text', + action='store_true', + help='Also save as text file (raw wikitext)' + ) + + parser.add_argument( + '--database', + action='store_true', + help='Upload scraped pages to PostgreSQL database' + ) + + parser.add_argument( + '--start', + type=int, + default=0, + help='Start index in the page list (default: 0)' + ) + + args = parser.parse_args() + + # Validate threads + if args.threads < 1 or args.threads > 20: + print("βœ— Error: Number of threads must be between 1 and 20") + sys.exit(1) + + # Load page list from JSON file + print(f"Loading page list from {args.json_file}...") + all_pages = load_page_list(args.json_file) + print(f"βœ“ Loaded {len(all_pages)} page titles") + + # Determine which pages to scrape based on --limit and --start + start_idx = args.start + + if args.limit: + end_idx = min(start_idx + args.limit, len(all_pages)) + pages_to_scrape = all_pages[start_idx:end_idx] + print(f"βœ“ Will scrape pages {start_idx + 1} to {end_idx} ({len(pages_to_scrape)} pages total)") + else: + pages_to_scrape = all_pages[start_idx:] + print(f"βœ“ Will scrape all pages starting from index {start_idx} ({len(pages_to_scrape)} pages total)") + + if not pages_to_scrape: + print("βœ— No pages to scrape!") + sys.exit(1) + + # Initialize scraper with specified number of threads + scraper = ConcurrentWikiScraper(max_workers=args.threads) + + # If --pages is specified, process in batches; otherwise process all at once + if args.pages and args.pages < len(pages_to_scrape): + print(f"βœ“ Processing in batches of {args.pages} pages with {args.threads} threads per batch") + print(f"βœ“ Each batch will be saved to a separate file\n") + + all_results = [] + total_pages = len(pages_to_scrape) + batch_number = 1 + saved_files = [] + + for batch_start in range(0, total_pages, args.pages): + batch_end = min(batch_start + args.pages, total_pages) + batch = pages_to_scrape[batch_start:batch_end] + + print(f"\n{'='*70}") + print(f"Batch {batch_number}: Processing pages {batch_start 
+ 1} to {batch_end} of {total_pages}") + print(f"{'='*70}") + + # Reset counters for this batch + scraper.completed = 0 + scraper.failed = 0 + + batch_results = scraper.scrape_pages_concurrent(batch) + all_results.extend(batch_results) + + print(f"βœ“ Batch complete: {len(batch_results)} pages scraped") + + # Save this batch to a separate file + if batch_results: + # Generate filename for this batch + base_name = args.output.replace('.json', '') + batch_filename = f"{base_name}_batch{batch_number}.json" + scraper.save_to_json(batch_results, batch_filename) + saved_files.append(batch_filename) + + # Save raw text file if requested + if args.text: + text_filename = f"{base_name}_batch{batch_number}.txt" + scraper.save_to_text(batch_results, text_filename) + saved_files.append(text_filename) + + batch_number += 1 + + # Small delay between batches to be nice to the server + if batch_end < total_pages: + time.sleep(1) + + results = all_results + print(f"\n{'='*70}") + print(f"All Batches Complete!") + print(f"Total pages scraped: {len(results)} out of {total_pages}") + print(f"Total files created: {len(saved_files)}") + print(f"{'='*70}\n") + + # Print list of saved files + print("Saved files:") + for filename in saved_files: + print(f" - scraped_data/{filename}") + print() + + # Upload to database if requested + if args.database and results: + try: + uploader = DatabaseUploader() + successful, failed = uploader.upload_pages(results) + + # Delete JSON files after successful upload + if successful > 0 and failed == 0: + print("\nβœ“ All pages uploaded successfully. 
Cleaning up JSON files...") + deleted_count = 0 + for filename in saved_files: + filepath = scraper.output_dir / filename + try: + if filepath.exists(): + filepath.unlink() + deleted_count += 1 + print(f" βœ“ Deleted: {filename}") + except Exception as e: + print(f" βœ— Failed to delete {filename}: {e}") + print(f"βœ“ Cleaned up {deleted_count} file(s)") + else: + print(f"\n⚠ Keeping JSON files due to upload errors (failed: {failed})") + except Exception as e: + print(f"βœ— Database upload failed: {e}") + print("⚠ Keeping JSON files due to upload failure") + else: + # Process all pages at once (no batching) + results = scraper.scrape_pages_concurrent(pages_to_scrape) + + # Save results to single file + if results: + scraper.save_to_json(results, args.output) + saved_json_files = [args.output] + + # Save raw text file if requested + if args.text: + text_file = args.output.replace('.json', '.txt') + scraper.save_to_text(results, text_file) + + # Upload to database if requested + if args.database: + try: + uploader = DatabaseUploader() + successful, failed = uploader.upload_pages(results) + + # Delete JSON files after successful upload + if successful > 0 and failed == 0: + print("\nβœ“ All pages uploaded successfully. Cleaning up JSON files...") + deleted_count = 0 + for filename in saved_json_files: + filepath = scraper.output_dir / filename + try: + if filepath.exists(): + filepath.unlink() + deleted_count += 1 + print(f" βœ“ Deleted: {filename}") + except Exception as e: + print(f" βœ— Failed to delete {filename}: {e}") + print(f"βœ“ Cleaned up {deleted_count} file(s)") + else: + print(f"\n⚠ Keeping JSON files due to upload errors (failed: {failed})") + except Exception as e: + print(f"βœ— Database upload failed: {e}") + print("⚠ Keeping JSON files due to upload failure") + + if results: + # Show sample of scraped data + print("\nSample scraped pages (first 3):") + print("="*70) + for i, page in enumerate(results[:3], 1): + print(f"{i}. 
{page['name']}") + print(f" Categories: {len(page['categories'])}, Links: {len(page['links'])}") + print(f" Content length: {len(page['text'])} characters") + else: + print("\nβœ— No pages were successfully scraped!") + sys.exit(1) + + +if __name__ == "__main__": + main() diff --git a/submissions/team_2/scraper/src/wikitext_cleaner.py b/submissions/team_2/scraper/src/wikitext_cleaner.py new file mode 100644 index 0000000..89516b8 --- /dev/null +++ b/submissions/team_2/scraper/src/wikitext_cleaner.py @@ -0,0 +1,343 @@ +#!/usr/bin/env python3 +""" +Wikitext Cleaner Module +Parses and cleans Wikitext content using mwparserfromhell library +Converts to clean, human-readable Markdown format +""" + +import mwparserfromhell +import re +from typing import Dict, List, Tuple + + +class WikitextCleaner: + """Cleans and converts Wikitext to readable Markdown format""" + + def __init__(self): + """Initialize the cleaner""" + pass + + def extract_infobox_summary(self, wikicode) -> Tuple[str, mwparserfromhell.wikicode.Wikicode]: + """ + Extract Infobox templates and convert to human-readable summary paragraph + + Args: + wikicode: Parsed wikicode object + + Returns: + Tuple of (summary_text, wikicode_without_infobox) + """ + summary_parts = [] + infoboxes_found = [] + + # Find all Infobox templates + for template in wikicode.filter_templates(): + template_name = str(template.name).strip().lower() + + # Check if it's an infobox (various naming conventions) + if 'infobox' in template_name: + infoboxes_found.append(template) + + # Extract key-value pairs from the infobox + infobox_data = [] + for param in template.params: + param_name = str(param.name).strip() + param_value = str(param.value).strip() + + # Skip empty values or common metadata fields + if not param_value or param_name.lower() in ['image', 'caption', 'alt', 'image_size']: + continue + + # Clean the value (remove nested templates, links, etc.) 
+                    cleaned_value = self._clean_text(param_value)
+
+                    if cleaned_value:
+                        infobox_data.append(f"{param_name}: {cleaned_value}")
+
+                # Create summary paragraph
+                if infobox_data:
+                    summary_parts.append("**Summary:** " + "; ".join(infobox_data) + ".")
+
+        # Remove infoboxes from wikicode
+        for infobox in infoboxes_found:
+            wikicode.remove(infobox)
+
+        summary_text = "\n\n".join(summary_parts)
+        return summary_text, wikicode
+
+    def convert_headers_to_markdown(self, text: str) -> str:
+        """
+        Convert Wikitext headers to Markdown headers
+
+        Examples:
+            ==Introduction== -> ## Introduction
+            ===Subsection=== -> ### Subsection
+
+        Args:
+            text: Text with Wikitext headers
+
+        Returns:
+            Text with Markdown headers
+        """
+        # Match headers: ==Header== or ===Header=== etc.
+        # Count the number of = signs and convert to # signs
+        def replace_header(match):
+            equals_count = len(match.group(1))
+            header_text = match.group(2).strip()
+            markdown_level = '#' * equals_count
+            return f"{markdown_level} {header_text}"
+
+        # Pattern: (={2,})(.*?)\1
+        # Matches symmetric runs of equals signs around the header text
+        text = re.sub(r'^(={2,})(.*?)\1\s*$', replace_header, text, flags=re.MULTILINE)
+
+        return text
+
+    def clean_internal_links(self, wikicode) -> mwparserfromhell.wikicode.Wikicode:
+        """
+        Clean internal wiki links
+
+        Examples:
+            [[Target|Display]] -> Display
+            [[Target]] -> Target
+
+        Args:
+            wikicode: Parsed wikicode object
+
+        Returns:
+            Cleaned wikicode
+        """
+        for wikilink in wikicode.filter_wikilinks():
+            # Use the display text if available, otherwise the link target
+            if wikilink.text:
+                display_text = str(wikilink.text)
+            else:
+                display_text = str(wikilink.title)
+
+            # Replace the wikilink with plain text
+            wikicode.replace(wikilink, display_text)
+
+        return wikicode
+
+    def _clean_text(self, text: str) -> str:
+        """
+        Helper method to clean nested wiki markup from text
+
+        Args:
+            text: Text to clean
+
+        Returns:
+            Cleaned text
+        """
+        try:
+            # Parse the text
+            wikicode = mwparserfromhell.parse(text)
+
+            # Remove templates
+            for template in wikicode.filter_templates():
+                wikicode.remove(template)
+
+            # Clean links
+            for wikilink in wikicode.filter_wikilinks():
+                if wikilink.text:
+                    wikicode.replace(wikilink, str(wikilink.text))
+                else:
+                    wikicode.replace(wikilink, str(wikilink.title))
+
+            # Get plain text
+            cleaned = wikicode.strip_code()
+
+            # Remove extra whitespace
+            cleaned = re.sub(r'\s+', ' ', cleaned).strip()
+
+            return cleaned
+        except Exception:
+            # Fallback: basic regex cleaning
+            cleaned = re.sub(r'\{\{[^}]+\}\}', '', text)  # Remove templates
+            cleaned = re.sub(r'\[\[([^|\]]+\|)?([^\]]+)\]\]', r'\2', cleaned)  # Clean links
+            cleaned = re.sub(r'\s+', ' ', cleaned).strip()
+            return cleaned
+
+    def remove_templates(self, wikicode, keep_content: bool = True) -> mwparserfromhell.wikicode.Wikicode:
+        """
+        Remove wiki templates (except infoboxes, which are handled separately)
+
+        Args:
+            wikicode: Parsed wikicode object
+            keep_content: If True, try to preserve readable content from templates
+
+        Returns:
+            Cleaned wikicode
+        """
+        templates_to_remove = []
+
+        for template in wikicode.filter_templates():
+            template_name = str(template.name).strip().lower()
+
+            # Skip infoboxes (handled separately)
+            if 'infobox' in template_name:
+                continue
+
+            templates_to_remove.append(template)
+
+        # Remove templates, preserving readable content where possible
+        for template in templates_to_remove:
+            if not keep_content:
+                wikicode.remove(template)
+                continue
+            replaced = False
+            try:
+                # For cite templates, keep the citation title as plain text
+                if 'cite' in str(template.name).lower():
+                    for param in template.params:
+                        if 'title' in str(param.name).lower():
+                            content = str(param.value).strip()
+                            if content:
+                                wikicode.replace(template, content)
+                                replaced = True
+                                break
+                if not replaced:
+                    wikicode.remove(template)
+            except ValueError:
+                # Node was already replaced or removed (e.g. a nested template)
+                pass
+
+        return wikicode
+
+    def clean_wikitext(self, wikitext: str) -> str:
+        """
+        Main cleaning function - converts Wikitext to clean Markdown
+
+        Process:
+        1. Parse Wikitext using mwparserfromhell
+        2. Extract and convert Infoboxes to a summary paragraph
+        3. Clean internal links
+        4. Remove other templates
+        5. Convert headers to Markdown
+        6. Final text cleanup
+
+        Args:
+            wikitext: Raw Wikitext content
+
+        Returns:
+            Cleaned Markdown text
+        """
+        if not wikitext or not wikitext.strip():
+            return ""
+
+        try:
+            # Parse the wikitext
+            wikicode = mwparserfromhell.parse(wikitext)
+
+            # Step 1: Extract infobox and create summary
+            infobox_summary, wikicode = self.extract_infobox_summary(wikicode)
+
+            # Step 2: Clean internal links
+            wikicode = self.clean_internal_links(wikicode)
+
+            # Step 3: Remove other templates
+            wikicode = self.remove_templates(wikicode, keep_content=True)
+
+            # Convert to string
+            text = str(wikicode)
+
+            # Step 4: Convert headers to Markdown
+            text = self.convert_headers_to_markdown(text)
+
+            # Step 5: Convert external links [url text] to Markdown [text](url)
+            text = re.sub(r'\[([^\s\]]+)\s+([^\]]+)\]', r'[\2](\1)', text)
+
+            # Step 6: Remove HTML comments
+            text = re.sub(r'<!--.*?-->', '', text, flags=re.DOTALL)
+
+            # Step 7: Strip paired and self-closing <ref> citation markup
+            text = re.sub(r'<ref[^>]*>.*?</ref>', '', text, flags=re.DOTALL)
+            text = re.sub(r'<ref[^>]*/?>', '', text)
+
+            # Step 8: Clean up excessive whitespace
+            text = re.sub(r'\n{3,}', '\n\n', text)  # Max 2 consecutive newlines
+            text = re.sub(r' +', ' ', text)  # Collapse multiple spaces
+            text = text.strip()
+
+            # Step 9: Add the infobox summary at the beginning if it exists
+            if infobox_summary:
+                text = infobox_summary + "\n\n" + text
+
+            return text
+
+        except Exception as e:
+            # If parsing fails, return original text with basic cleaning
+            print(f"Warning: Failed to parse wikitext: {e}")
+            text = wikitext
+            text = self.convert_headers_to_markdown(text)
+            text = re.sub(r'\[\[([^|\]]+\|)?([^\]]+)\]\]', r'\2', text)
+            return text
+
+    def clean_page_data(self, page_data: Dict) -> Dict:
+        """
+        Clean a page data dictionary - adds a 'cleaned_text' field
+
+        Args:
+            page_data: Dictionary containing page data with a 'text' field
+
+        Returns:
+            Updated dictionary with a 'cleaned_text' field added
+        """
+        if 'text' in page_data:
+            page_data['cleaned_text'] = self.clean_wikitext(page_data['text'])
+        else:
+            page_data['cleaned_text'] = ""
+
+        return page_data
+
+
+def clean_scraped_results(scraped_data: List[Dict]) -> List[Dict]:
+    """
+    Clean a list of scraped page data
+
+    Args:
+        scraped_data: List of page data dictionaries
+
+    Returns:
+        List of cleaned page data dictionaries
+    """
+    cleaner = WikitextCleaner()
+    cleaned_results = []
+
+    for page_data in scraped_data:
+        cleaned_page = cleaner.clean_page_data(page_data)
+        cleaned_results.append(cleaned_page)
+
+    return cleaned_results
+
+
+# Example usage
+if __name__ == "__main__":
+    # Test the cleaner with sample wikitext
+    sample_wikitext = """
+{{Infobox university
+| name = Indian Institute of Technology Kharagpur
+| established = 1951
+| type = Public
+| location = Kharagpur, West Bengal, India
+}}
+
+==Introduction==
+The '''Indian Institute of Technology Kharagpur''' ('''IIT Kharagpur''' or '''IIT KGP''') is a public technical university established by the government of India in [[Kharagpur]], [[West Bengal]], India.
+
+===History===
+It was established in 1951 and is the first of the [[IIT]]s to be established.
+
+==Campus==
+The campus is spread over {{convert|2100|acre|km2}}.
+
+See also [[IIT Bombay]] and [[IIT Delhi|Delhi]] for more information.
+"""
+
+    cleaner = WikitextCleaner()
+    cleaned = cleaner.clean_wikitext(sample_wikitext)
+
+    print("Original Wikitext:")
+    print("=" * 70)
+    print(sample_wikitext)
+    print("\n" + "=" * 70)
+    print("\nCleaned Markdown:")
+    print("=" * 70)
+    print(cleaned)
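Reviewer note: the external-link, HTML-comment, and `<ref>`-stripping steps in `clean_wikitext` are easy to get subtly wrong, so they are worth sanity-checking in isolation. A minimal standalone sketch of the intended patterns (the `sample` string here is invented for illustration; it is not from the wiki):

```python
import re

# Hypothetical sample combining the constructs Steps 5-7 target:
# an external link, an HTML comment, a paired <ref>, and a self-closing <ref/>
sample = ('See [https://wiki.metakgp.org MetaKGP]<!-- hidden note -->'
          '<ref name="a">a citation</ref> and<ref name="b" />.')

text = sample
# External links [url text] -> Markdown [text](url)
text = re.sub(r'\[([^\s\]]+)\s+([^\]]+)\]', r'[\2](\1)', text)
# Strip HTML comments
text = re.sub(r'<!--.*?-->', '', text, flags=re.DOTALL)
# Strip paired and self-closing <ref> tags
text = re.sub(r'<ref[^>]*>.*?</ref>', '', text, flags=re.DOTALL)
text = re.sub(r'<ref[^>]*/?>', '', text)

print(text)  # -> See [MetaKGP](https://wiki.metakgp.org) and.
```

A quick check like this is cheap insurance before wiring the cleaner into the RAG ingestion pipeline, since a regex that leaks `<ref>` bodies into chunks would pollute retrieval.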