[Showcase] Eco-Metal — 63 Modular Plugins for Advanced LLM Inference natively on MSL #3403
helgklaizar
started this conversation in
Show and tell
Hey MLX team! First of all, huge thanks for the amazing work on MLX. Apple Silicon is absolutely game-changing for local AI.
I've spent the past few months building on your framework to create Eco-Metal (https://github.com/helgklaizar/Eco-Metal), a production-ready ecosystem of 63 modular AI components fully optimized for the Mac. Our main focus was eliminating slow Python overhead and CUDA wrappers: we've ported and hardened custom Metal Shading Language (MSL) kernels and native `mx.fast` paths for several SOTA algorithms.
We maintain 100% test coverage and native JIT execution via `mx.fast.metal_kernel` and `mx.fast.scaled_dot_product_attention`. I would be honored if someone from the core AMLR team (or anyone else!) could check out the project. I'm completely open to submitting any of our high-performance MSL kernels directly upstream to MLX if you feel they fit the framework's requirements!
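For readers unfamiliar with what the fused path replaces, here is a minimal, unfused reference of scaled dot-product attention in plain NumPy. This is only an illustrative sketch of the computation that `mx.fast.scaled_dot_product_attention` performs in a single fused Metal kernel; the function name `sdpa_reference`, the shapes, and the data are mine, not from Eco-Metal or MLX.

```python
import numpy as np

def sdpa_reference(q, k, v, scale):
    """Unfused scaled dot-product attention.

    q, k, v: arrays of shape (batch, heads, seq_len, head_dim).
    Returns an array of the same shape as v.
    """
    # Attention scores: (batch, heads, seq_len, seq_len)
    scores = scale * (q @ k.transpose(0, 1, 3, 2))
    # Numerically stable softmax over the last axis
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    # Weighted sum of values: (batch, heads, seq_len, head_dim)
    return weights @ v

rng = np.random.default_rng(0)
q = rng.standard_normal((1, 2, 4, 8))
k = rng.standard_normal((1, 2, 4, 8))
v = rng.standard_normal((1, 2, 4, 8))
out = sdpa_reference(q, k, v, scale=1.0 / np.sqrt(8))
print(out.shape)  # (1, 2, 4, 8)
```

The fused kernel avoids materializing the full `(seq_len, seq_len)` score matrix in memory, which is where most of the speedup over a naive implementation like this one comes from.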
Ecosystem Repository: https://github.com/helgklaizar/Eco-Metal
Cheers, and keep up the brilliant work!