---
layout: default
title: "Chapter 3: LLM Provider Configuration"
nav_order: 3
parent: Devika Tutorial
---
Welcome to Chapter 3: LLM Provider Configuration. In this part of *Devika Tutorial: Open-Source Autonomous AI Software Engineer*, you will build an intuitive mental model first, then move into concrete implementation details and practical production tradeoffs.
This chapter covers how to configure Claude 3, GPT-4, Gemini, Mistral, Groq, and local Ollama models in Devika's `config.toml`, and how to select the right provider for each agent role. By the end, you will be able to:
- configure API keys and model identifiers for every supported LLM provider
- understand Devika's model selection mechanism and how to switch providers per project
- evaluate the cost, latency, and quality tradeoffs across providers for autonomous coding tasks
- configure Ollama for fully offline, local LLM operation without external API keys
To follow along:

- open `config.toml` and locate the `[API_KEYS]` and `[API_MODELS]` sections
- add your API key for at least one cloud provider (Claude, OpenAI, Google, Mistral, or Groq)
- set the model name for each provider section to a currently available model identifier
- optionally install and start Ollama with a code-capable model for local operation
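The steps above can be sketched concretely. The fragment below shows the general shape of the `[API_KEYS]` and `[API_MODELS]` sections; the exact key names and model identifiers are illustrative, so match them against your own copy of `config.toml` (model identifiers in particular change over time):

```toml
[API_KEYS]
OPENAI = "sk-..."        # leave blank for providers you don't use
CLAUDE = ""
GOOGLE = ""

[API_MODELS]
OPENAI = "gpt-4"
OLLAMA = "codellama"     # a local model, pulled with `ollama pull codellama`
```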
You now know how to configure any of Devika's supported LLM providers, select the right model for each use case, and operate Devika in fully local mode using Ollama.
Next: Chapter 4: Task Planning and Code Generation
The `real_time_logs` function in `devika.py` handles a key part of this chapter's functionality:

```python
@app.route("/api/logs", methods=["GET"])
def real_time_logs():
    log_file = logger.read_log_file()
    return jsonify({"logs": log_file})
```

This route serves the log file as JSON so the UI can show agent and model activity as it happens — including every provider switch and settings change you make in this chapter.
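To see what that route actually returns, here is a stdlib-only sketch of the same pattern with no running server. `read_log_file` is a stand-in for Devika's `logger.read_log_file()`, and the temp file stands in for the real log:

```python
import json
import os
import tempfile

def read_log_file(path: str) -> str:
    # Stand-in for logger.read_log_file(): return the whole log as one string.
    with open(path, "r", encoding="utf-8") as f:
        return f.read()

def real_time_logs(path: str) -> str:
    # Equivalent of jsonify({"logs": log_file}) without a Flask app.
    return json.dumps({"logs": read_log_file(path)})

with tempfile.TemporaryDirectory() as d:
    log_path = os.path.join(d, "devika.log")
    with open(log_path, "w", encoding="utf-8") as f:
        f.write("Devika is up and running!\n")
    result = real_time_logs(log_path)
    print(result)
```

The payload is a single JSON object with one `logs` key, which is why the UI can poll this endpoint and simply re-render the whole buffer.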
The `set_settings` function in `devika.py` handles a key part of this chapter's functionality:

```python
@app.route("/api/settings", methods=["POST"])
@route_logger(logger)
def set_settings():
    data = request.json
    config.update_config(data)
    return jsonify({"message": "Settings updated"})
```

This is the endpoint behind the settings UI: the posted JSON (API keys, model choices) is handed to `config.update_config`, which applies it to Devika's configuration.
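What `config.update_config` does internally is not shown in this chapter; a reasonable assumption is a deep merge of the posted settings into the stored configuration. The helper below is a hypothetical stand-in illustrating that behavior, not Devika's actual implementation:

```python
def update_config(config: dict, updates: dict) -> dict:
    # Hypothetical stand-in for config.update_config: deep-merge `updates`
    # into `config`, recursing into nested sections like [API_KEYS].
    for key, value in updates.items():
        if isinstance(value, dict) and isinstance(config.get(key), dict):
            update_config(config[key], value)   # merge nested section
        else:
            config[key] = value                 # overwrite scalar value
    return config

config = {"API_KEYS": {"OPENAI": ""}, "API_MODELS": {"OPENAI": "gpt-4"}}
update_config(config, {"API_KEYS": {"OPENAI": "sk-..."}})
print(config["API_KEYS"]["OPENAI"])  # sk-...
```

A merge (rather than a wholesale replace) means a POST that only changes one API key leaves the other sections untouched.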
The `get_settings` function in `devika.py` is the read side of the same API:

```python
@app.route("/api/settings", methods=["GET"])
@route_logger(logger)
def get_settings():
    configs = config.get_config()
    return jsonify({"settings": configs})
```

It returns the current configuration so the UI can display which providers and models are active.
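A client can use that response to discover which providers are actually configured. The sketch below assumes the response shape implied by the route above (a `settings` key wrapping the config sections); the sample key values are illustrative:

```python
import json

# Simulated /api/settings response body, shaped as {"settings": {...}}.
response_body = json.dumps({
    "settings": {
        "API_KEYS": {"OPENAI": "sk-...", "CLAUDE": ""},
        "API_MODELS": {"OPENAI": "gpt-4"},
    }
})

settings = json.loads(response_body)["settings"]
# A provider counts as configured when its API key is non-empty.
configured = [name for name, key in settings["API_KEYS"].items() if key]
print(configured)  # ['OPENAI']
```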
The `status` function in `devika.py` is a simple health check, and the file ends with the server startup block:

```python
@app.route("/api/status", methods=["GET"])
@route_logger(logger)
def status():
    return jsonify({"status": "server is running!"})

if __name__ == "__main__":
    logger.info("Devika is up and running!")
    socketio.run(app, debug=False, port=1337, host="0.0.0.0")
```

Note that the server binds to all interfaces (`0.0.0.0`) on port 1337, so the UI — or any client on your network — can reach these endpoints.
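A client-side liveness check against that endpoint only needs to compare the `status` field to the literal string the route returns. A minimal sketch, run here against hard-coded payloads rather than a live server:

```python
import json

def is_healthy(body: str) -> bool:
    # The /api/status route returns exactly {"status": "server is running!"}.
    return json.loads(body).get("status") == "server is running!"

print(is_healthy('{"status": "server is running!"}'))  # True
print(is_healthy('{"status": "down"}'))                # False
```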
The four endpoints relate to the shared `Config` object as follows — only the settings routes touch it directly:

```mermaid
flowchart TD
    A[real_time_logs]
    B[set_settings]
    C[get_settings]
    D[status]
    E[Config]
    B --> E
    C --> E
```