Projects built with the help of AI (Average Intelligence) tools have "Sloppy" in their name; the rest never used it.
Don't get me wrong, AI is a great tool for getting stuff done fast, but it's dumb as hell and has to be carefully guided.
The current transformer LLM architecture is basically the T9 text prediction from old phones, and the models are just dictionaries.
Repeatedly tapping your phone's text predictions: that's the current state of AI. We need "Conception-Text" models.
But autocomplete has always been a good tool anyway. Now, with proper expectations, you're ready to start building.
Oh, BTW: stop using cloud AI services and set up your own local LM Studio/ComfyUI machine. Save monies.
Three weeks of pure suffering and you're ready for an actual AI future; it'll pay off in less than a year.
Our video cards can not only run games but also write code. That's pretty cool, right?
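To show how little glue a local setup needs: LM Studio exposes an OpenAI-compatible HTTP API (by default on `localhost:1234`). A minimal sketch, using only the standard library; the model name and sampling parameters here are placeholders, not anything this repo prescribes:

```python
import json
import urllib.request

# LM Studio's local server speaks the OpenAI chat-completions format.
# Default address; check your own install's server settings.
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"

def build_request(prompt: str) -> dict:
    # Minimal OpenAI-style payload. LM Studio typically ignores the
    # "model" string and uses whatever model is currently loaded.
    return {
        "model": "local-model",  # placeholder name
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def ask_local(prompt: str) -> str:
    payload = json.dumps(build_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        LMSTUDIO_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=60) as resp:
            body = json.load(resp)
        return body["choices"][0]["message"]["content"]
    except OSError as exc:
        # No server running, wrong port, etc.
        return f"(no local server reachable: {exc})"
```

Point your scripts at this instead of a cloud endpoint and the monthly bill goes away.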
- ComfyUI-Enhancement-Utils - PC resource monitor and execution follower
- ComfyUI-SloppyAudio - Audio editing tools based on SoX and BS-RoFormer
- smol-caveman - Portable Caveman prompt designed for local LLMs. Read less slop and get much better results.
- ComfyUI-SloppyInstall.bat - Simplified `pip install -r "requirements.txt"` for custom nodes in portable ComfyUI.
- SloppyServer.bat - Single file local/Wi-Fi server for debugging multithreaded mobile Unity WebGL builds and other apps
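Some context on that last item: multithreaded Unity WebGL builds rely on SharedArrayBuffer, which browsers only enable on cross-origin-isolated pages, so a plain file server isn't enough; the server must send the COOP/COEP headers. A minimal Python sketch of that idea (not SloppyServer.bat's actual implementation; the port and names are made up):

```python
import http.server

# Cross-origin-isolation headers browsers require before they enable
# SharedArrayBuffer (and thus WebGL threading).
ISOLATION_HEADERS = {
    "Cross-Origin-Opener-Policy": "same-origin",
    "Cross-Origin-Embedder-Policy": "require-corp",
}

class IsolatedHandler(http.server.SimpleHTTPRequestHandler):
    """Static file handler that adds the isolation headers to every response."""

    def end_headers(self):
        for name, value in ISOLATION_HEADERS.items():
            self.send_header(name, value)
        super().end_headers()

def run(port: int = 8000) -> None:
    # Bind on 0.0.0.0 so phones on the same Wi-Fi can load the build.
    http.server.ThreadingHTTPServer(("0.0.0.0", port), IsolatedHandler).serve_forever()
```

Run it from the build folder and open `http://<your-lan-ip>:8000/` on the device you're debugging.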

