diff --git a/README.md b/README.md
index fd04248..433d23c 100644
--- a/README.md
+++ b/README.md
@@ -19,7 +19,7 @@ The MCP server currently supports the below Cortex AI capabilities:
 
 A simple configuration file is used to create tooling for the various Cortex AI features. An example can be seen at [services/tools_config.yaml](services/tools_config.yaml) and a template is below. Many Cortex Search and Cortex Analyst services can be added. Ideal descriptions are both highly descriptive and mutually exclusive. The path to this configuration file will be passed to the server and the contents used to create MCP server tools at startup.
 ```
-cortex_complete: # Set default model if one is not specified by user in Cortex Copmlete tool
+cortex_complete: # Set default model if one is not specified by user in Cortex Complete tool
   default_model: "snowflake-llama-3.3-70b"
 search_services: # List all Cortex Search services
   - service_name: ""
diff --git a/services/tools_config.yaml b/services/tools_config.yaml
index fe97c26..9e37af6 100644
--- a/services/tools_config.yaml
+++ b/services/tools_config.yaml
@@ -1,4 +1,4 @@
-cortex_complete: # Set default model if one is not specified by user in Cortex Copmlete tool
+cortex_complete: # Set default model if one is not specified by user in Cortex Complete tool
   default_model: "snowflake-llama-3.3-70b"
 search_services: # List all Cortex Search services
   - service_name: ""