---
title: Superlinked
sidebarTitle: Superlinked
---

[Superlinked](https://superlinked.com) is a self-hosted inference engine (SIE) for embedding, reranking, and extraction. The `sie-lancedb` package registers SIE as a first-class embedding function in LanceDB's embeddings registry, so embeddings are computed automatically on insert and search. You need a running SIE instance; see the [Superlinked quickstart](https://superlinked.com/docs) for deployment options.
## Installation

<CodeGroup>
```bash Python
pip install sie-lancedb
```

```bash TypeScript
npm install @superlinked/sie-lancedb @lancedb/lancedb
```
</CodeGroup>
| 20 | +## Registered functions |
| 21 | + |
| 22 | +Importing `sie_lancedb` registers two embedding functions in LanceDB's registry: |
| 23 | + |
| 24 | +| Name | Purpose | |
| 25 | +|---|---| |
| 26 | +| `"sie"` | Dense text embeddings | |
| 27 | +| `"sie-multivector"` | ColBERT-style late interaction with MaxSim scoring | |
| 28 | + |
| 29 | +Supported parameters on `.create()`: |
| 30 | + |
| 31 | +| Parameter | Type | Description | |
| 32 | +|---|---|---| |
| 33 | +| `model` | `str` | Any of 85+ SIE-supported models (e.g. `BAAI/bge-m3`, `NovaSearch/stella_en_400M_v5`, `colbert-ir/colbertv2.0`) | |
| 34 | +| `base_url` | `str` | URL of the SIE endpoint (e.g. `http://localhost:8080`) | |
| 35 | + |
| 36 | +## Usage |
| 37 | + |
| 38 | +```python |
| 39 | +import lancedb |
| 40 | +from lancedb.embeddings import get_registry |
| 41 | +from lancedb.pydantic import LanceModel, Vector |
| 42 | +import sie_lancedb # registers "sie" and "sie-multivector" |
| 43 | + |
| 44 | +sie = get_registry().get("sie").create( |
| 45 | + model="BAAI/bge-m3", |
| 46 | + base_url="http://localhost:8080", |
| 47 | +) |
| 48 | + |
| 49 | +class Documents(LanceModel): |
| 50 | + text: str = sie.SourceField() |
| 51 | + vector: Vector(sie.ndims()) = sie.VectorField() |
| 52 | + |
| 53 | +db = lancedb.connect("~/.lancedb") |
| 54 | +table = db.create_table("docs", schema=Documents, mode="overwrite") |
| 55 | + |
| 56 | +table.add([ |
| 57 | + {"text": "Machine learning is a subset of AI."}, |
| 58 | + {"text": "Neural networks use multiple layers."}, |
| 59 | + {"text": "Python is popular for ML development."}, |
| 60 | +]) |
| 61 | + |
| 62 | +results = table.search("What is deep learning?").limit(3).to_list() |
| 63 | +``` |
| 64 | + |
| 65 | +LanceDB handles embedding generation for both inserts and queries automatically, based on the `SourceField` / `VectorField` declarations on the schema. |
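As a simplified mental model of that insert-time behavior (not LanceDB's actual code path; the stand-in embedding function is invented for illustration), the `SourceField` / `VectorField` wiring amounts to embedding the source column in batch before rows are written:

```python
def add_with_embeddings(rows, embed, source="text", target="vector"):
    """Simplified model of insert-time embedding: the declared source
    column is embedded in one batch and written to the vector column."""
    vectors = embed([r[source] for r in rows])
    return [{**r, target: v} for r, v in zip(rows, vectors)]

# Stand-in embedding function (illustrative only): length-2 vectors
# built from simple text statistics instead of a real model.
fake_embed = lambda texts: [[len(t), t.count(" ")] for t in texts]

out = add_with_embeddings([{"text": "hello world"}], fake_embed)
```

Queries go through the same function: the search string is embedded with the registered model before the vector index is probed.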
## Hybrid search with reranker

`SIEReranker` plugs into LanceDB's hybrid search pipeline. It uses SIE's cross-encoder `score()` to rerank combined vector + full-text search results. You need a full-text search index on the column first:

```python
from sie_lancedb import SIEReranker

# Create FTS index for hybrid search
table.create_fts_index("text", replace=True)

results = (
    table.search("What is deep learning?", query_type="hybrid")
    .rerank(SIEReranker(model="jinaai/jina-reranker-v2-base-multilingual"))
    .limit(5)
    .to_list()
)

for r in results:
    print(f"{r['_relevance_score']:.3f} {r['text']}")
```

The reranker also works with pure vector or pure FTS search via `.rerank()`.
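For intuition about the "combined" step that happens before reranking: hybrid search first merges the vector and full-text hit lists, and a common merging scheme (LanceDB's default when no custom reranker is supplied) is reciprocal rank fusion. A minimal sketch of RRF in plain Python, illustrative only and not the `sie-lancedb` implementation:

```python
def rrf(vector_hits, fts_hits, k=60):
    """Reciprocal rank fusion: each id scores the sum of 1 / (k + rank)
    over the result lists it appears in (rank is 1-based), so items
    ranked well by both retrievers float to the top."""
    scores = {}
    for hits in (vector_hits, fts_hits):
        for rank, doc_id in enumerate(hits, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

merged = rrf(["a", "b"], ["b", "c"])  # "b" appears in both lists
```

A cross-encoder reranker like `SIEReranker` then rescores this merged list with the query and each candidate text together, which is typically more accurate than rank fusion alone.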

## ColBERT / multivector

`SIEMultiVectorEmbeddingFunction` (registered as `"sie-multivector"`) works with LanceDB's native `MultiVector` type and MaxSim scoring for ColBERT and ColPali models:

```python
from lancedb.pydantic import MultiVector

sie_colbert = get_registry().get("sie-multivector").create(
    model="jinaai/jina-colbert-v2",
    base_url="http://localhost:8080",
)

class ColBERTDocs(LanceModel):
    text: str = sie_colbert.SourceField()
    vector: MultiVector(sie_colbert.ndims()) = sie_colbert.VectorField()

table = db.create_table("colbert_docs", schema=ColBERTDocs, mode="overwrite")
table.add([{"text": "Machine learning is a subset of AI."}])

# MaxSim search - query and document multivectors compared token-by-token
results = table.search("What is ML?").limit(5).to_list()
```
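For intuition, MaxSim scores a query against a document by matching each query token embedding to its best-matching document token embedding and summing those maxima. A minimal NumPy sketch of the scoring rule (illustrative only, not the engine's implementation):

```python
import numpy as np

def maxsim(query_vecs: np.ndarray, doc_vecs: np.ndarray) -> float:
    """MaxSim: for each query token embedding, take its highest cosine
    similarity among the document token embeddings, then sum."""
    q = query_vecs / np.linalg.norm(query_vecs, axis=1, keepdims=True)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    sims = q @ d.T  # shape: (n_query_tokens, n_doc_tokens)
    return float(sims.max(axis=1).sum())
```

Because every query token is matched independently, late interaction preserves fine-grained term-level signals that a single pooled vector averages away.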

## Entity extraction

`SIEExtractor` adds entity extraction to LanceDB's data-enrichment workflows. Extract entities from a text column and merge the results back as a structured Arrow column, enabling filtered search on extracted entities:

```python
from sie_lancedb import SIEExtractor

extractor = SIEExtractor(
    base_url="http://localhost:8080",
    model="urchade/gliner_multi-v2.1",
)

extractor.enrich_table(
    table,
    source_column="text",
    target_column="entities",
    labels=["person", "technology", "organization"],
    id_column="id",
)
```

The `entities` column stores structured Arrow data (`list<struct<text, label, score, start, end, bbox>>`), so you can filter on extracted entities in queries.
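As a rough sketch of what that nested structure looks like once rows are pulled back into Python (the sample rows and scores below are invented for illustration), each entity arrives as a dict matching the struct fields, so a post-filter on `label` is straightforward:

```python
# Invented sample rows shaped like the enriched table's output:
# "entities" mirrors list<struct<text, label, score, ...>>.
rows = [
    {"text": "Guido created Python.",
     "entities": [{"text": "Guido", "label": "person", "score": 0.97},
                  {"text": "Python", "label": "technology", "score": 0.99}]},
    {"text": "The sky is blue.", "entities": []},
]

def has_label(row, label):
    """True if any extracted entity on the row carries the given label."""
    return any(e["label"] == label for e in row["entities"])

people = [r["text"] for r in rows if has_label(r, "person")]
```

The same filtering can usually be pushed down into the query itself via a SQL `where()` clause on the nested column, which avoids pulling unmatched rows to the client.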

## Links

- [`sie-lancedb` on PyPI](https://pypi.org/project/sie-lancedb/)
- [`@superlinked/sie-lancedb` on npm](https://www.npmjs.com/package/@superlinked/sie-lancedb)
- [Superlinked on GitHub](https://github.com/superlinked/sie)
- [Superlinked docs](https://superlinked.com/docs)