Is there a way to query your custom RAG for context and data so that you can use an LLM to answer based on the retrieved data?
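Yes, that is the core retrieval-augmented generation (RAG) loop: retrieve the most relevant documents for a query, then place them in the LLM prompt as context. Below is a minimal sketch of that pattern; the corpus, the word-overlap scoring function, and the prompt template are all illustrative stand-ins (a real system would use a vector store with embedding similarity and an actual chat-completion client).

```python
from collections import Counter

# Hypothetical document corpus; in practice these would live in a vector store.
DOCS = [
    "RAG retrieves relevant documents and feeds them to an LLM as context.",
    "Embeddings map text to vectors so similar passages score close together.",
    "Prompt templates combine the user question with retrieved context.",
]

def score(query: str, doc: str) -> int:
    # Toy relevance measure: count overlapping words. A production system
    # would use embedding similarity (e.g. cosine distance) instead.
    q = Counter(query.lower().split())
    d = Counter(doc.lower().split())
    return sum((q & d).values())

def retrieve(query: str, k: int = 2) -> list[str]:
    # Return the top-k documents ranked by the relevance score.
    return sorted(DOCS, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str) -> str:
    # Assemble retrieved context plus the question into one prompt string,
    # which you would then send to whatever LLM endpoint you use.
    context = "\n".join(retrieve(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

print(build_prompt("How does RAG give an LLM context?"))
```

The final prompt string is what you pass to the model, so the answer is grounded in the retrieved data rather than the model's parametric memory alone.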