### Prerequisites

- [x] I am running the latest code. Mention the version if possible as well.
- [x] I carefully followed the [README.md](https://github.com/ggml-org/llama.cpp/blob/master/README.md).
- [x] I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
- [x] I reviewed the [Discussions](https://github.com/ggml-org/llama.cpp/discussions), and have a new and useful enhancement to share.

### Feature Description

Support for the Model Context Protocol (MCP).

### Motivation

Hello,

Do you plan to include support for MCP? It would be great to have it.

### Possible Implementation

_No response_