feat: Integrate Local LLM with multi-level fallback architecture
- Add TinyLlama 1.1B local model integration
- Implement OpenAI → Groq → Local LLM → Fallback Plan chain
- Enhance study plan generation with offline capability
- Improve translation fallback system
- Fix all Pylint issues to achieve 10.00/10 score
- Update README.md with comprehensive documentation
- Add local_llm.py service for offline operation
- Optimize model loading and error handling
- Add models to .gitignore to exclude large files
`README.md`: 81 additions, 31 deletions
# EduPlannerBotAI

**EduPlannerBotAI** is a Telegram bot built with `aiogram 3.x` and powered by a multi-level LLM architecture. It generates personalized study plans, exports them to PDF/TXT, and sends reminders as Telegram messages. All data is stored using TinyDB (no other DBs supported).

> **Note:** All code comments and docstrings are in English for international collaboration and code clarity. All user-facing messages and buttons are automatically translated to the user's selected language.

## 📌 Features
- 📚 Generate personalized study plans using a multi-level LLM architecture
- 📝 Export study plans to PDF/TXT
- ⏰ Send reminders as Telegram messages for each study step
- 🗄️ Store data using TinyDB (no SQL/other DBs)
- 🌐 Multilingual: English, Russian, Spanish — all messages, buttons, and files are translated in real time using LLMs
- 🏷️ All keyboards are always shown with a short message, ensuring buttons are reliably displayed
- ❌ No empty or invisible messages — all user-facing text is always non-empty (prevents Telegram errors)
- 🔄 Language selection buttons are not translated, so the language filter works correctly
- 🤖 If translation is not possible, the original English text is sent
- 🧩 Simple, maintainable, idiomatic codebase — ready for extension
- 🚀 **NEW**: Local LLM integration for offline operation and guaranteed availability

## 🆕 Multi-Level LLM Architecture
The bot now features a multi-level fallback system that keeps the service reliable even when external APIs are unavailable:

### LLM Processing Chain

1. **OpenAI GPT** (Priority 1) - Primary model for generating study plans
2. **Groq** (Priority 2) - Secondary model, used if OpenAI is unavailable
3. **Local LLM** (Priority 3) - Local TinyLlama 1.1B model for offline operation
4. **Fallback Plan** (Priority 4) - Predefined high-quality study plan template

### How It Works

The bot automatically attempts to generate study plans using the available services in order of priority:

1. **Primary**: OpenAI API (if `OPENAI_API_KEY` is set and quota is available)
2. **Fallback 1**: [Groq](https://groq.com/) (if `GROQ_API_KEY` is set)
3. **Fallback 2**: Local LLM (TinyLlama 1.1B model)
4. **Last Resort**: Local plan generator (comprehensive template)
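The priority order above can be sketched as a try-each-provider loop. This is an illustrative sketch only: `generate_plan` and the stub providers are hypothetical names, not the bot's actual internals.

```python
# Hypothetical sketch of the fallback chain; not the bot's actual code.
def generate_plan(prompt, providers):
    """Try each provider in priority order; fall back to a static template."""
    for provider in providers:
        try:
            plan = provider(prompt)
            if plan:  # a non-empty result counts as success
                return plan
        except Exception:
            continue  # API error, exhausted quota, missing model, etc.
    # Last resort: the predefined study plan template.
    return "Fallback plan: Week 1 - fundamentals, Week 2 - practice, ..."

# Stub providers standing in for OpenAI, Groq, and the local LLM:
def openai_stub(prompt):
    raise RuntimeError("429 Too Many Requests")  # quota exhausted

def groq_stub(prompt):
    return f"Groq study plan for: {prompt}"

print(generate_plan("Learn Python", [openai_stub, groq_stub]))
# → Groq study plan for: Learn Python
```

The key design point is that a provider failure is silent from the user's perspective: the loop simply moves to the next priority level.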

### Local LLM Integration

The bot now includes a local TinyLlama 1.1B model that provides:

- **Offline Operation**: Works without an internet connection
- **Fast Response**: No network latency
- **Privacy**: All processing happens locally
- **Guaranteed Availability**: Always accessible as a fallback
- **High Quality**: Professional-grade study plan generation
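One way such a model can be wrapped is with lazy loading, so the large weights are only pulled in when the chain actually reaches the local fallback. The class name and Hugging Face model id below are assumptions for illustration; the project's real implementation is in `local_llm.py`.

```python
# Illustrative lazy-loading wrapper (LocalLLM and the model id are assumed,
# not the actual local_llm.py API).
class LocalLLM:
    MODEL_ID = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # assumed HF model id

    def __init__(self):
        self._pipe = None  # weights are loaded lazily, on first use

    def _load(self):
        # Deferred import: keeps bot startup fast and avoids a hard
        # dependency when the local fallback is never reached.
        from transformers import pipeline
        self._pipe = pipeline("text-generation", model=self.MODEL_ID)

    def generate(self, prompt: str, max_new_tokens: int = 512) -> str:
        if self._pipe is None:
            self._load()
        result = self._pipe(prompt, max_new_tokens=max_new_tokens)
        return result[0]["generated_text"]

# Usage (commented out because the first call downloads the model):
# llm = LocalLLM()
# print(llm.generate("Create a 4-week study plan for Python"))
```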

### Translation Fallback

The same multi-level system applies to text translation:

1. **OpenAI** for high-quality translations
2. **Groq** as a secondary translation service
3. **Local LLM** for offline translation capability
4. **Original Text** if all translation services fail
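The step that matters most here is the last one: the user always receives readable text, never an empty message. A minimal sketch (the `translate` helper and service names are hypothetical, not the bot's actual translation module):

```python
# Hypothetical sketch of the translation fallback; not the bot's actual module.
def translate(text, target_lang, services):
    """Return the first successful translation, else the original text."""
    for service in services:
        try:
            translated = service(text, target_lang)
            if translated and translated.strip():
                return translated  # non-empty translation: success
        except Exception:
            continue  # try the next service in the chain
    return text  # last resort: send the original English text

def broken_service(text, target_lang):
    raise ConnectionError("translation service unavailable")

# Every service fails, so the user still gets the English original:
print(translate("Your study plan is ready!", "ru", [broken_service]))
# → Your study plan is ready!
```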

## 🆕 Groq Fallback Integration

If the OpenAI API is unavailable, out of quota, or not configured, the bot will automatically fall back to [Groq](https://groq.com/):

- **OpenAI-compatible API**
- **Always available fallback**

### How to use Groq

1. Register and get your API key at [Groq](https://console.groq.com/keys).
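After step 1, the key presumably goes into the bot's environment (the remaining setup steps are not shown in this diff). A sketch of a typical `.env`, using the variable names mentioned in this README, with placeholder values:

```env
# Primary LLM (optional if you rely on the fallbacks)
OPENAI_API_KEY=sk-your-openai-key
# First fallback
GROQ_API_KEY=gsk-your-groq-key
```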
No other changes are needed — the bot will automatically use Groq if OpenAI is unavailable.
## 🌐 Multilingual Support

You can choose your preferred language for all bot interactions! Use the `/language` command to select from English, Russian, or Spanish. The bot will automatically translate all responses, study plans, and reminders to your chosen language using the multi-level LLM system.

**Supported languages:**

- English (`en`)
- Русский (`ru`)
- Español (`es`)

Translations are performed in real time using the same LLM architecture that generates study plans, ensuring high-quality, context-aware results. The system automatically falls back through the available services to provide the best possible translation quality.

- Python version compatibility: 3.10, 3.11, 3.12, 3.13
- Custom `.pylintrc` configuration

## 📝 Release 4.0.0 Highlights

- **Multi-Level LLM Architecture**: OpenAI → Groq → Local LLM → Fallback Plan
- **Local LLM Integration**: TinyLlama 1.1B model for offline operation
- **Guaranteed Availability**: Bot works even without an internet connection
- **Enhanced Fallback System**: Robust error handling and service switching
- **Improved Plan Quality**: Professional-grade study plan templates
- **Offline Translation**: Local LLM supports offline text translation
- **Performance Optimization**: Efficient model loading and inference
- **Comprehensive Logging**: Detailed monitoring of LLM service transitions
- All user-facing messages and buttons always contain non-empty text, eliminating Telegram errors
- The entire bot scenario is fully localized, with multi-level translation fallback
- Codebase is fully in English (comments, docstrings, messages), PEP8 and pylint compliant
- 100% test coverage for all core logic and handlers
- Project is ready for production use and easy extension
## ⚠️ Handling Frequent 429 Errors

If you're experiencing too many `429 Too Many Requests` errors, consider the following:

* 🔁 Increase `MAX_RETRIES`
* 🧠 Use a lighter OpenAI model (e.g., `gpt-3.5-turbo` instead of `gpt-4`)
* 💳 Upgrade your OpenAI plan to one with a higher request quota
* 🚀 **NEW**: The bot automatically falls back to Groq and the Local LLM to maintain service availability
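The `MAX_RETRIES` option above implies a retry loop. Below is a minimal exponential-backoff sketch; all names here are illustrative, not the bot's actual configuration handling:

```python
import time

MAX_RETRIES = 3  # mirrors the setting mentioned above (value assumed)

class RateLimitError(Exception):
    """Stands in for an HTTP 429 Too Many Requests response."""

def with_retries(call, retries=MAX_RETRIES, base_delay=1.0):
    """Retry a callable with exponential backoff on rate-limit errors."""
    for attempt in range(retries):
        try:
            return call()
        except RateLimitError:
            if attempt == retries - 1:
                raise  # give up; the fallback chain takes over from here
            time.sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...

# A flaky call that succeeds on its third attempt:
attempts = []
def flaky():
    attempts.append(1)
    if len(attempts) < 3:
        raise RateLimitError("429 Too Many Requests")
    return "ok"

print(with_retries(flaky, base_delay=0.01))
# → ok
```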