[Proposal] Bringing mlx-coder (On-device coding agent + Vision support) to the MLX ecosystem #469

@eduardogoncalves

Description

Hi everyone,

I’ve been building mlx-coder, an on-device coding agent written natively in Swift for Apple Silicon. My goal was to create a tool that leverages MLX for in-process inference, prioritizing privacy and performance by eliminating external servers and the overhead of Node.js or HTTP bridges.

I’m reaching out to see if the community feels this would be a good fit for the ml-explore organization or how I can best contribute to the mlx-swift ecosystem.

Key features:

  • Pure Swift & MLX: No Python bridges or heavy dependencies, just native performance.
  • Zero-Dependency Architecture: Bypasses the need for local web servers, making it lightweight and truly "local-first."
  • Interactive REPL: A chat interface tailored for rapid prototyping and agentic workflows.
  • Vision Support (VLM): I’ve recently integrated support for Qwen 3.5 and Gemma 4, enabling multimodal tasks like code generation from visual context.
  • Sandboxed & Secure: Leverages native macOS sandboxing to ensure a secure execution environment, providing peace of mind when the agent interacts with local files.
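To make the "in-process, no server" point concrete, here is a minimal sketch of what direct inference with mlx-swift can look like. It is based on the `MLXLMCommon`/`MLXLLM` modules from the mlx-swift-examples repo; the specific names (`LLMModelFactory`, `ModelConfiguration`, `UserInput`, `GenerateParameters`, `generate`) and the model id are assumptions drawn from that project, not mlx-coder's actual internals, so please check them against the current API.

```swift
// Sketch only: one-shot, in-process generation with mlx-swift.
// Type and method names follow mlx-swift-examples' MLXLMCommon module
// and are assumptions -- verify against the current API before use.
import Foundation
import MLXLMCommon
import MLXLLM

func complete(prompt: String) async throws -> String {
    // Load model weights directly into this process -- no local web
    // server, no HTTP round trip, nothing leaves the machine.
    let container = try await LLMModelFactory.shared.loadContainer(
        configuration: ModelConfiguration(
            id: "mlx-community/Qwen2.5-Coder-1.5B-Instruct-4bit")) // hypothetical choice

    return try await container.perform { context in
        // Tokenize / template the prompt with the model's own processor.
        let input = try await context.processor.prepare(
            input: UserInput(prompt: prompt))

        // Generate synchronously within the same address space.
        let result = try MLXLMCommon.generate(
            input: input,
            parameters: GenerateParameters(temperature: 0.6),
            context: context
        ) { _ in .more } // streaming callback; return .stop to end early
        return result.output
    }
}
```

A REPL like mlx-coder's would wrap a loop around a call of this shape, keeping the loaded `container` alive between turns so the model is paged in once.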

I believe mlx-coder can serve as a practical reference for developers building complex, agentic applications on top of mlx-swift.

I’d love to hear your thoughts on the best path forward. Should I keep this as a standalone community project, or is there interest in bringing this closer to the official ml-explore org?

Best regards,
Itamar Eduardo Gonçalves de Oliveira
