
Commit 3fc9466

Update to v0.2.1

1 parent fb2b303, commit 3fc9466

7 files changed: 233 additions & 55 deletions

README.md

Lines changed: 3 additions & 1 deletion

```diff
@@ -151,6 +151,7 @@ while True :
 <details><summary><b>v0.2.x : Agents</b></summary>
 
 - **v0.2.0** : Adding `agent_base`
+- **v0.2.1** : Updated `agent_base` and added a more concrete example of agents
 </details>
 
 
@@ -159,8 +160,9 @@ while True :
 - [tools call in a JSON database](https://github.com/SyntaxError4Life/open-taranis/blob/main/examples/test_json_database.py)
 - [tools call in a HR JSON database in multi-rounds](https://github.com/SyntaxError4Life/open-taranis/blob/main/examples/test_HR_json_database.py)
 - [simple search agent with Brave API](https://github.com/SyntaxError4Life/open-taranis/blob/main/examples/brave_research.py)
+- [full auto search agent with Brave API](https://github.com/SyntaxError4Life/open-taranis/blob/main/examples/brave_research_loop.py)
 
 ## Links
 
 - [PyPI](https://pypi.org/project/open-taranis/)
-- [GitHub Repository](https://github.com/SyntaxError4Life/open-taranis)
+- [GitHub Repository](https://github.com/SyntaxError4Life/open-taranis)
```

examples/brave_research.py

Lines changed: 65 additions & 20 deletions

```diff
@@ -5,12 +5,46 @@
 request = T.clients.openrouter_request
 
 messages = [
-    T.create_system_prompt("""You are an autonomous AI web search agent.
-For example, you can search for URLs in Brave and scrape them to obtain all their content.
+    T.create_system_prompt("""You are an expert, autonomous web research assistant. Your role is to use your web search tools to answer the user's questions with precision and thoroughness.
 
-Don't hesitate to conduct in-depth searches, such as navigating from site to site using the URLs you find there, and so on.
-Make as many function calls as needed for the tasks the user assigns you.
-You must execute them flawlessly and never cheat; complete your mission successfully!"""),
+## General Behavior Rules
+- Be objective, concise, and factual in your responses.
+- Never offer personal opinions or alter found information.
+- If a request is not related to web research (e.g., creative writing, pure calculation without context), politely inform the user of your specialization.
+- **CRITICAL LANGUAGE RULE: You MUST ALWAYS respond EXCLUSIVELY in the language of the user. Do not switch languages under any circumstance.**
+
+## Absolute Precision Rules
+1. **Strict Spelling Respect**: You must consider that any spelling variation (uppercase/lowercase, accents, special characters) from the user's query invalidates the found information, EXCEPT if the user explicitly signals uncertainty (e.g., "I don't know the spelling", "approximate spelling").
+2. **Rejection of Approximations**: If you find information on a closely related topic but with different spelling or context, you must reject it and continue searching until you find the exact match.
+
+## Investigation Strategy
+### Phase 1: Initial Research
+- Use the `brave_research` tool with targeted queries (precise keywords, no long sentences).
+- If the query is complex or ambiguous, break it down into several sub-queries that you execute **in parallel** in the same response.
+- Carefully examine titles and URLs to assess relevance.
+
+### Phase 2: Deep Exploration
+- As soon as a result looks promising but insufficient (summary too short), immediately use `fast_scraping` on its URL to get the full content.
+- **Follow leads**: If the scraped content contains relevant links or references, use `fast_scraping` on these new URLs to dig deeper.
+- Continue this exploration until you have a complete and verified answer.
+
+### Phase 3: Verification and Synthesis
+- In case of contradictions between sources, flag it and dig deeper to find a reliable source or consensus.
+- Group information from multiple sources into a coherent and structured response.
+- Cite your sources (site name) for each key piece of information.
+
+## Tool Usage
+- **Parallelism**: Never hesitate to make multiple tool calls (`brave_research` and/or `fast_scraping`) in a single response when it speeds up research.
+- **URL Validation**: Only use `fast_scraping` on URLs obtained via `brave_research`. Never invent or guess a URL.
+- **Hierarchy**: Always prioritize searching (`brave_research`) before scraping, unless the user provides a URL directly.
+
+## User Interaction
+- You may ask clarifying questions if the request is too vague or ambiguous to launch an effective search.
+- Be direct: avoid superfluous pleasantries ("Hello", "With pleasure").
+- If you cannot find information after several attempts, inform the user detailing your unsuccessful searches.
+
+Your ultimate goal is to provide a **complete, exact, and sourced** answer using all available web navigation capabilities.
+"""),
     T.create_user_prompt(input("Request : "))
 ]
 
@@ -20,7 +54,7 @@
 respond=""
 
 for token, tool_calls, run in T.handle_streaming(request(
-    client=client,messages=messages,model="nvidia/nemotron-3-nano-30b-a3b:free",
+    client=client,messages=messages,model="stepfun/step-3.5-flash:free",
     tools=T.functions_to_tools([brave_research,fast_scraping])
 )):
     if token :
@@ -34,24 +68,35 @@
         fid, fname, args, _ = T.handle_tool_call(tool_call)
         tool_response=""
 
-        if fname == "brave_research":
-
-            print(f"\nSearch on Brave : {args["web_request"]} \n")
-
-            results=brave_research(
-                web_request=args["web_request"],
-                count=5,
-                country="en"
-            )
+        if fname == "fast_scraping":
 
-            for item in results["web"]["results"]:
-                tool_response+= f"{item['title']} : {item['url']}\n"
+            print("\n","="*60)
+            print(f"{fname} : {args["url"]}")
+            print("="*60,"\n")
+
+            tool_response = fast_scraping(url=args["url"])
 
-        if fname == "fast_scraping":
+        elif fname == "brave_research":
+            print("\n","="*60)
+            print(f"{fname} : {args["web_request"]}")
+            print("="*60,"\n")
 
-            print(f"\nScraping {args["url"]}\n")
-            tool_response=fast_scraping(url=args["url"])
+            tool_response = ""
+            results = brave_research(web_request=args["web_request"], count=5, country="US")
 
+            try :
+                # Ensure results contains 'web' and 'results' keys
+                web_data = results.get("web", {})
+
+                if "results" not in web_data:
+                    tool_response = "No results found in web search."
+                else:
+                    for item in web_data["results"]:
+                        tool_response += f"{item['title']} : {item['url']}\n"
+
+            except : # When brave_research returns an error message as a str
+                print(f"Search error : {results}")
+                tool_response = results
 
         messages.append(T.create_function_response(
             id=fid,result=tool_response,name=fname
```
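The new error handling replaces direct `results["web"]["results"]` indexing with a guarded lookup that also survives the API returning a plain error string. A minimal standalone sketch of the same pattern — the function name is illustrative, not part of open-taranis, and it catches `AttributeError` explicitly rather than the commit's bare `except`:

```python
def format_search_results(results):
    """Turn a search-API response into a 'title : url' listing.

    Falls back to a readable message when the response is not the
    expected {"web": {"results": [...]}} shape, or is an error string.
    """
    try:
        web_data = results.get("web", {})  # missing key -> empty dict, not KeyError
        if "results" not in web_data:
            return "No results found in web search."
        return "".join(f"{item['title']} : {item['url']}\n"
                       for item in web_data["results"])
    except AttributeError:
        # A plain string has no .get(): the API returned an error message
        return results

# A well-formed response and an error string both come back as usable text
ok = format_search_results({"web": {"results": [{"title": "PyPI", "url": "https://pypi.org"}]}})
err = format_search_results("rate limit exceeded")
```

The narrow `AttributeError` keeps genuine bugs (e.g. a typo'd key inside the loop) from being silently swallowed, which a bare `except` would hide.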

examples/brave_research_loop.py (new file)

Lines changed: 135 additions & 0 deletions

```python
MODEL = "stepfun/step-3.5-flash:free"
# Recommend GLM 4.7 but not free...

# Coded in v0.2.1

# ==============================================================

import open_taranis as T
from open_taranis.tools import fast_scraping, brave_research

class Brave_Agent(T.agent_base):
    def __init__(self):
        super().__init__()

        self.client = T.clients.openrouter()
        self._system_prompt = [T.create_system_prompt("""You are an expert, autonomous web research assistant. Your role is to use your web search tools to answer the user's questions with precision and thoroughness.

## General Behavior Rules
- Be objective, concise, and factual in your responses.
- Never offer personal opinions or alter found information.
- If a request is not related to web research (e.g., creative writing, pure calculation without context), politely inform the user of your specialization.
- **CRITICAL LANGUAGE RULE: You MUST ALWAYS respond EXCLUSIVELY in the language of the user. Do not switch languages under any circumstance.**

## Absolute Precision Rules
1. **Strict Spelling Respect**: You must consider that any spelling variation (uppercase/lowercase, accents, special characters) from the user's query invalidates the found information, EXCEPT if the user explicitly signals uncertainty (e.g., "I don't know the spelling", "approximate spelling").
2. **Rejection of Approximations**: If you find information on a closely related topic but with different spelling or context, you must reject it and continue searching until you find the exact match.

## Investigation Strategy
### Phase 1: Initial Research
- Use the `brave_research` tool with targeted queries (precise keywords, no long sentences).
- If the query is complex or ambiguous, break it down into several sub-queries that you execute **in parallel** in the same response.
- Carefully examine titles and URLs to assess relevance.

### Phase 2: Deep Exploration
- As soon as a result looks promising but insufficient (summary too short), immediately use `fast_scraping` on its URL to get the full content.
- **Follow leads**: If the scraped content contains relevant links or references, use `fast_scraping` on these new URLs to dig deeper.
- Continue this exploration until you have a complete and verified answer.

### Phase 3: Verification and Synthesis
- In case of contradictions between sources, flag it and dig deeper to find a reliable source or consensus.
- Group information from multiple sources into a coherent and structured response.
- Cite your sources (site name) for each key piece of information.

## Tool Usage
- **Parallelism**: Never hesitate to make multiple tool calls (`brave_research` and/or `fast_scraping`) in a single response when it speeds up research.
- **URL Validation**: Only use `fast_scraping` on URLs obtained via `brave_research`. Never invent or guess a URL.
- **Hierarchy**: Always prioritize searching (`brave_research`) before scraping, unless the user provides a URL directly.

## User Interaction
- You may ask clarifying questions if the request is too vague or ambiguous to launch an effective search.
- Be direct: avoid superfluous pleasantries ("Hello", "With pleasure").
- If you cannot find information after several attempts, inform the user detailing your unsuccessful searches.

Your ultimate goal is to provide a **complete, exact, and sourced** answer using all available web navigation capabilities.
"""
        )]

        self.tools = T.functions_to_tools([
            fast_scraping,brave_research
        ])


    def create_stream(self):
        return T.clients.openrouter_request(
            client=self.client,
            messages=self._system_prompt+self.messages,
            model=MODEL,
            tools=self.tools,
        )

    def execute_tools(self, fname, args):
        if fname == "fast_scraping":

            print("\n","="*60)
            print(f"{fname} : {args["url"]}")
            print("="*60,"\n")

            return fast_scraping(url=args["url"])

        elif fname == "brave_research":
            print("\n","="*60)
            print(f"{fname} : {args["web_request"]}")
            print("="*60,"\n")

            tool_response = ""
            results = brave_research(web_request=args["web_request"], count=5, country="US")

            try :
                # Ensure results contains 'web' and 'results' keys
                web_data = results.get("web", {})

                if "results" not in web_data:
                    tool_response = "No results found in web search."
                else:
                    for item in web_data["results"]:
                        tool_response += f"{item['title']} : {item['url']}\n"

                return tool_response

            except : # When brave_research returns an error message as a str
                print(f"Search error : {results}")
                return results

    def manage_messages_after_reply(self):

        # Remove the content from all tool results after the agent has finished
        i = 0
        for msg in self.messages:
            if msg["role"] == 'tool':
                self.messages[i] = T.create_function_response(
                    id=msg["tool_call_id"],result="",name=msg["name"]
                )

            i+=1

        self.messages = self.messages[-200:] # Remember the last 200 messages (tools and user/assistant)

My_agent = Brave_Agent()


while True :
    prompt = input("user : ")

    if prompt == "/exit":
        print("="*60)

        exit()

    print("\n\nagent : ", end="")

    for t in My_agent(prompt):
        print(t, end="", flush=True)

    print("\n\n","="*60,"\n")
```
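The `manage_messages_after_reply` hook blanks out tool payloads and caps the history at the last 200 messages once the agent has answered. The same history-pruning idea, sketched standalone — message dicts follow the OpenAI chat format, the function name is illustrative, and `keep_last=2` here is just for the demo (the commit uses 200):

```python
def prune_history(messages, keep_last=200):
    """Blank out tool payloads in place, then cap history length.

    Tool results are usually the bulkiest messages; once the agent has
    replied, only their existence (id/name) matters for the transcript.
    """
    for i, msg in enumerate(messages):
        if msg["role"] == "tool":
            messages[i] = {"role": "tool",
                           "tool_call_id": msg["tool_call_id"],
                           "name": msg["name"],
                           "content": ""}
    return messages[-keep_last:]

history = [
    {"role": "user", "content": "question"},
    {"role": "tool", "tool_call_id": "c1", "name": "brave_research", "content": "huge blob"},
    {"role": "assistant", "content": "answer"},
]
pruned = prune_history(history, keep_last=2)
# keeps the blanked tool message and the assistant reply
```

Blanking rather than deleting tool messages preserves the tool-call/tool-result pairing many chat APIs require, while still freeing the context window.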

pyproject.toml

Lines changed: 3 additions & 3 deletions

```diff
@@ -4,8 +4,8 @@ build-backend = "hatchling.build"
 
 [project]
 name = "open-taranis"
-version = "0.2.0b"
-description = "Minimalist Python framework for AI agents logic-only coding with streaming, tool calls, and multi-LLM provider support"
+version = "0.2.1"
+description = "Python framework for AI agents logic-only coding with streaming, tool calls, and multi-LLM provider support"
 authors = [{name = "SyntaxError4Life", email = "contact@zanomega.com"}]
 dependencies = ["requests", "packaging", "openai", "bs4"]
 readme = "README.md"
@@ -21,4 +21,4 @@ classifiers = [
 taranis = "open_taranis.CLI:main"
 
 [tool.hatch.build.targets.wheel]
-packages = ["src/open_taranis"]
+packages = ["src/open_taranis"]
```
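The version string moves from `0.2.0b` to `0.2.1`. Under PEP 440 (implemented by the `packaging` library the project already depends on), `0.2.0b` is normalized to the beta pre-release `0.2.0b0`, which sorts before the final `0.2.0` and well before `0.2.1`, so installers see this as a clean upgrade:

```python
from packaging.version import Version

old, new = Version("0.2.0b"), Version("0.2.1")

# "0.2.0b" is normalized to the beta pre-release 0.2.0b0
assert old.is_prerelease and str(old) == "0.2.0b0"

# Pre-releases sort before their final release, so the bump is an upgrade
assert old < Version("0.2.0") < new
```

Note that by default `pip` skips pre-releases like `0.2.0b`, so moving to a final `0.2.1` also makes the package installable without `--pre`.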

src/open_taranis/CLI.py

Lines changed: 12 additions & 12 deletions

```diff
@@ -78,25 +78,25 @@ def run(stdscr):
     elif display_mode == "API":
         text = [
             "APIs registered :",
-            ("- [x]" if os.environ.get('OPENROUTER_API') else "- [ ]") + " openrouter",
-            ("- [x]" if os.environ.get('HF_API') else "- [ ]") + " huggingface",
-            ("- [x]" if os.environ.get('VENICEAI_API') else "- [ ]") + " venice.ai",
-            ("- [x]" if os.environ.get('DEEPSEEK_API') else "- [ ]") + " deepseek.ai",
-            ("- [x]" if os.environ.get('XAI_API') else "- [ ]") + " x.ai",
-            ("- [x]" if os.environ.get('GROQ_API') else "- [ ]") + " groq",
+            ("- [x]" if os.environ.get('OPENROUTER_API_KEY') else "- [ ]") + " openrouter",
+            ("- [x]" if os.environ.get('HUGGINGFACE_API_KEY') else "- [ ]") + " huggingface",
+            ("- [x]" if os.environ.get('VENICE_API_KEY') else "- [ ]") + " venice.ai",
+            ("- [x]" if os.environ.get('DEEPSEEK_API_KEY') else "- [ ]") + " deepseek.ai",
+            ("- [x]" if os.environ.get('XAI_API_KEY') else "- [ ]") + " x.ai",
+            ("- [x]" if os.environ.get('GROQ_API_KEY') else "- [ ]") + " groq",
             "",
             "To show the env var : /show more"
         ]
 
     elif display_mode == 'MORE_API':
         text = [
             "APIs and env_var",
-            "- openrouter = 'OPENROUTER_API'",
-            "- huggingface = 'HF_API'",
-            "- venice.ai = 'VENICEAI_API'",
-            "- deepseek.ai = 'DEEPSEEK_API'",
-            "- x.ai = 'XAI_API'",
-            "- groq = 'GROQ_API'",
+            "- openrouter = 'OPENROUTER_API_KEY'",
+            "- huggingface = 'HUGGINGFACE_API_KEY'",
+            "- venice.ai = 'VENICE_API_KEY'",
+            "- deepseek.ai = 'DEEPSEEK_API_KEY'",
+            "- x.ai = 'XAI_API_KEY'",
+            "- groq = 'GROQ_API_KEY'",
         ]
 
     if display_mode != "NONE" :
```
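The commit renames every provider's environment variable to a consistent `*_API_KEY` suffix. The checklist the CLI renders is just a conditional prefix per provider; a minimal sketch of that pattern (provider list abbreviated, function name illustrative):

```python
import os

def api_checklist(providers):
    """Render '- [x] name' when the provider's env var is set, '- [ ]' otherwise."""
    return [("- [x]" if os.environ.get(var) else "- [ ]") + f" {name}"
            for name, var in providers]

providers = [("openrouter", "OPENROUTER_API_KEY"), ("groq", "GROQ_API_KEY")]

os.environ["OPENROUTER_API_KEY"] = "sk-demo"  # pretend one key is registered
os.environ.pop("GROQ_API_KEY", None)          # and the other is not
lines = api_checklist(providers)
# lines -> ["- [x] openrouter", "- [ ] groq"]
```

Driving the checklist from a single `(name, env_var)` table would also have made this rename a one-line-per-provider change instead of twelve edited lines.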
