Merged
2 changes: 1 addition & 1 deletion .github/workflows/unittest.yml
Original file line number Diff line number Diff line change
@@ -5,7 +5,7 @@ on: [push, pull_request]
jobs:
quick-test:
runs-on: ubuntu-latest

steps:
- name: Checkout code
uses: actions/checkout@v4
2 changes: 1 addition & 1 deletion docker/Dockerfile
@@ -1,6 +1,6 @@
FROM python:3.11-slim
WORKDIR /app
COPY . .
-RUN pip install flask praisonai==2.2.11 gunicorn markdown
+RUN pip install flask praisonai==2.2.12 gunicorn markdown
EXPOSE 8080
CMD ["gunicorn", "-b", "0.0.0.0:8080", "api:app"]
2 changes: 1 addition & 1 deletion docker/Dockerfile.chat
@@ -13,7 +13,7 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
RUN pip install --no-cache-dir \
praisonaiagents>=0.0.4 \
praisonai_tools \
-"praisonai==2.2.11" \
+"praisonai==2.2.12" \
"praisonai[chat]" \
"embedchain[github,youtube]"

2 changes: 1 addition & 1 deletion docker/Dockerfile.dev
@@ -15,7 +15,7 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
RUN pip install --no-cache-dir \
praisonaiagents>=0.0.4 \
praisonai_tools \
-"praisonai==2.2.11" \
+"praisonai==2.2.12" \
"praisonai[ui]" \
"praisonai[chat]" \
"praisonai[realtime]" \
2 changes: 1 addition & 1 deletion docker/Dockerfile.ui
@@ -13,7 +13,7 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
RUN pip install --no-cache-dir \
praisonaiagents>=0.0.4 \
praisonai_tools \
-"praisonai==2.2.11" \
+"praisonai==2.2.12" \
"praisonai[ui]" \
"praisonai[crewai]"

2 changes: 1 addition & 1 deletion docs/api/praisonai/deploy.html
@@ -110,7 +110,7 @@ <h2 id="raises">Raises</h2>
file.write("FROM python:3.11-slim\n")
file.write("WORKDIR /app\n")
file.write("COPY . .\n")
-file.write("RUN pip install flask praisonai==2.2.11 gunicorn markdown\n")
+file.write("RUN pip install flask praisonai==2.2.12 gunicorn markdown\n")
file.write("EXPOSE 8080\n")
file.write('CMD ["gunicorn", "-b", "0.0.0.0:8080", "api:app"]\n')

2 changes: 1 addition & 1 deletion docs/developers/local-development.mdx
@@ -27,7 +27,7 @@ WORKDIR /app

COPY . .

-RUN pip install flask praisonai==2.2.11 watchdog
+RUN pip install flask praisonai==2.2.12 watchdog

EXPOSE 5555

2 changes: 1 addition & 1 deletion docs/ui/chat.mdx
@@ -155,7 +155,7 @@ To facilitate local development with live reload, you can use Docker. Follow the

COPY . .

-RUN pip install flask praisonai==2.2.11 watchdog
+RUN pip install flask praisonai==2.2.12 watchdog

EXPOSE 5555

2 changes: 1 addition & 1 deletion docs/ui/code.mdx
@@ -208,7 +208,7 @@ To facilitate local development with live reload, you can use Docker. Follow the

COPY . .

-RUN pip install flask praisonai==2.2.11 watchdog
+RUN pip install flask praisonai==2.2.12 watchdog

EXPOSE 5555

18 changes: 1 addition & 17 deletions praisonai/agents_generator.py
@@ -438,41 +438,25 @@ def _run_crewai(self, config, topic, tools_dict):
if llm_model:
llm = PraisonAIModel(
model=llm_model.get("model") or os.environ.get("MODEL_NAME") or "openai/gpt-4o",
api_key_var=None, # Don't rely on env var lookup
base_url=self.config_list[0].get('base_url') if self.config_list else None
).get_model()
# Override with explicit API key from config_list
if self.config_list and self.config_list[0].get('api_key'):
llm.api_key = self.config_list[0]['api_key']
Comment on lines 438 to -446
Contributor (severity: high)
It looks like the logic to explicitly override the LLM's API key using self.config_list[0].get('api_key') after creating the PraisonAIModel instance has been removed here and in subsequent blocks for function_calling_llm.

Previously, if a user provided an api_key directly in the config_list entry within their agents.yaml (or equivalent config), that key would be used to set the llm.api_key attribute. With this change, the PraisonAIModel will now rely solely on its internal logic to determine the API key, which primarily involves looking up environment variables based on the model prefix (e.g., OPENAI_API_KEY, GROQ_API_KEY, etc.).

This change simplifies the code but potentially breaks backward compatibility for users who were providing API keys directly in their YAML configuration config_list entries instead of environment variables. Should we document this change in behavior or consider restoring the ability to load the API key from config_list for backward compatibility?

else:
llm = PraisonAIModel(
api_key_var=None, # Don't rely on env var lookup
base_url=self.config_list[0].get('base_url') if self.config_list else None
).get_model()
# Override with explicit API key from config_list
if self.config_list and self.config_list[0].get('api_key'):
llm.api_key = self.config_list[0]['api_key']

# Configure function calling LLM
function_calling_llm_model = details.get('function_calling_llm')
if function_calling_llm_model:
function_calling_llm = PraisonAIModel(
model=function_calling_llm_model.get("model") or os.environ.get("MODEL_NAME") or "openai/gpt-4o",
api_key_var=None, # Don't rely on env var lookup
base_url=self.config_list[0].get('base_url') if self.config_list else None
).get_model()
# Override with explicit API key from config_list
if self.config_list and self.config_list[0].get('api_key'):
function_calling_llm.api_key = self.config_list[0]['api_key']
else:
function_calling_llm = PraisonAIModel(
api_key_var=None, # Don't rely on env var lookup
base_url=self.config_list[0].get('base_url') if self.config_list else None
).get_model()
# Override with explicit API key from config_list
if self.config_list and self.config_list[0].get('api_key'):
function_calling_llm.api_key = self.config_list[0]['api_key']


# Create CrewAI agent
agent = Agent(
role=role_filled,
2 changes: 1 addition & 1 deletion praisonai/deploy.py
@@ -56,7 +56,7 @@ def create_dockerfile(self):
file.write("FROM python:3.11-slim\n")
file.write("WORKDIR /app\n")
file.write("COPY . .\n")
-file.write("RUN pip install flask praisonai==2.2.11 gunicorn markdown\n")
+file.write("RUN pip install flask praisonai==2.2.12 gunicorn markdown\n")
file.write("EXPOSE 8080\n")
file.write('CMD ["gunicorn", "-b", "0.0.0.0:8080", "api:app"]\n')

4 changes: 2 additions & 2 deletions pyproject.toml
@@ -1,6 +1,6 @@
[project]
name = "PraisonAI"
-version = "2.2.11"
+version = "2.2.12"
description = "PraisonAI is an AI Agents Framework with Self Reflection. PraisonAI application combines PraisonAI Agents, AutoGen, and CrewAI into a low-code solution for building and managing multi-agent LLM systems, focusing on simplicity, customisation, and efficient human-agent collaboration."
readme = "README.md"
license = ""
@@ -89,7 +89,7 @@ autogen = ["pyautogen>=0.2.19", "praisonai-tools>=0.0.15", "crewai"]

[tool.poetry]
name = "PraisonAI"
-version = "2.2.11"
+version = "2.2.12"
description = "PraisonAI is an AI Agents Framework with Self Reflection. PraisonAI application combines PraisonAI Agents, AutoGen, and CrewAI into a low-code solution for building and managing multi-agent LLM systems, focusing on simplicity, customisation, and efficient human-agent collaboration."
authors = ["Mervin Praison"]
license = ""
2 changes: 1 addition & 1 deletion uv.lock

Some generated files are not rendered by default.