Use OLLAMA on a local network #2147

@emrothenberg

Description

Describe the bug

I've been trying to use Bolt.DIY + OLLAMA on my local network, but because all requests to OLLAMA are made from the client side, I would also have to expose OLLAMA itself to the network. Can the requests be made server-side instead, or at least proxied through the server?

Link to the Bolt URL that caused the error

/

Steps to reproduce

  1. On a machine on the same network as the host (not the host machine itself), go to settings
  2. Click on local providers > OLLAMA
  3. Enter http://127.0.0.1:11434 as the base URL
  4. Bolt.DIY tries to fetch the OLLAMA instance (if one exists) from the client machine, not from the server (see the sketch after this list)
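To illustrate why this happens, here is a minimal sketch of a browser-side check against the configured base URL (the helper name `listOllamaModels` is hypothetical, not bolt.diy's actual code). Because the fetch runs in the visitor's browser, `127.0.0.1` resolves to the client machine rather than the bolt.diy host:

```ts
// Hypothetical illustration: checking the OLLAMA base URL from the browser.
// "127.0.0.1" is loopback on whichever machine runs this code — here, the client.
async function listOllamaModels(baseUrl: string): Promise<string[]> {
  // GET /api/tags is Ollama's endpoint for listing locally available models.
  const res = await fetch(`${baseUrl}/api/tags`);
  if (!res.ok) {
    throw new Error(`OLLAMA not reachable at ${baseUrl}: HTTP ${res.status}`);
  }
  const data = (await res.json()) as { models: { name: string }[] };
  return data.models.map((m) => m.name);
}

// Run from another machine on the LAN, this fails (or finds the wrong
// instance), because the address is resolved client-side:
listOllamaModels('http://127.0.0.1:11434').catch(console.error);
```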

Expected behavior

Requests to OLLAMA should be handled by the server so that clients can use the OLLAMA instance running on the server. A sketch of what that could look like follows.
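A minimal sketch of the kind of server-side proxy I have in mind, assuming a Remix-style resource route like the ones bolt.diy already uses (the route path `api.ollama.$`, and the `OLLAMA_API_BASE_URL` env var name are hypothetical):

```ts
// app/routes/api.ollama.$.ts — hypothetical resource route that forwards
// any /api/ollama/* request to the OLLAMA instance reachable from the server.
import type { LoaderFunctionArgs, ActionFunctionArgs } from '@remix-run/node';

// Resolved on the server, so only the bolt.diy host needs to reach OLLAMA.
const OLLAMA_UPSTREAM = process.env.OLLAMA_API_BASE_URL ?? 'http://127.0.0.1:11434';

async function proxyToOllama(request: Request, splat: string): Promise<Response> {
  const incoming = new URL(request.url);
  const target = `${OLLAMA_UPSTREAM}/${splat}${incoming.search}`;
  const hasBody = request.method !== 'GET' && request.method !== 'HEAD';
  // Forward method, headers, and body verbatim; stream the response back.
  return fetch(target, {
    method: request.method,
    headers: request.headers,
    body: hasBody ? request.body : undefined,
    // Node's fetch requires duplex: 'half' when streaming a request body.
    ...(hasBody ? { duplex: 'half' as const } : {}),
  } as RequestInit);
}

export async function loader({ request, params }: LoaderFunctionArgs) {
  return proxyToOllama(request, params['*'] ?? '');
}

export async function action({ request, params }: ActionFunctionArgs) {
  return proxyToOllama(request, params['*'] ?? '');
}
```

With something like this, clients would enter the bolt.diy server's own address as the base URL instead of a direct OLLAMA address, and OLLAMA could stay bound to loopback on the host.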

Screen Recording / Screenshot

No response

Platform

  • OS: Linux
  • Browser: Firefox, Chromium
  • Version: 1.0.0

Provider Used

OLLAMA

Model Used

No response

Additional context

I'm trying to run bolt.diy on my home server so I can use larger models while developing on my laptop.
