The most efficient one-page LoRA trainer for Anima 2B. Optimized for 6GB+ VRAM, featuring a smart dataset analyzer and real-time previews.
Updated May 15, 2026 - Python
Enhanced Prodigy optimizer with a ~21% GPU speedup, independent D estimation for multi-component models, and improved stability. Drop-in replacement for Prodigy with the same API.
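To illustrate what "independent D estimation for multi-component models" means, here is a minimal, dependency-free sketch. Prodigy-style optimizers adapt a distance-scale estimate `d` from accumulated gradient statistics; keeping one `d` per parameter group (e.g. the UNet vs. the text encoder in a LoRA trainer) lets each component find its own step scale. The function name, the statistics values, and the update form below are illustrative assumptions, not this repository's code.

```python
def update_d(d, numerator, denominator, growth_rate=float("inf")):
    """One Prodigy-style d update (toy sketch): d grows when the
    accumulated gradient-weighted progress (numerator) outpaces the
    accumulated gradient-magnitude sum (denominator)."""
    if denominator == 0:
        return d
    d_hat = numerator / denominator
    # d is monotonically non-decreasing, capped by an optional growth rate
    return max(d, min(d_hat, d * growth_rate))

# Independent estimation: each component tracks its own statistics and
# therefore its own d, instead of sharing one global estimate.
groups = {
    "unet":         {"d": 1e-6, "num": 3e-4, "den": 2.0},
    "text_encoder": {"d": 1e-6, "num": 5e-6, "den": 4.0},
}
for name, g in groups.items():
    g["d"] = update_d(g["d"], g["num"], g["den"])

print(groups["unet"]["d"])          # the faster-moving component gets a larger scale
print(groups["text_encoder"]["d"])
```

With a single shared `d`, the slow-moving text encoder would inherit the UNet's much larger estimate; per-group estimates avoid that coupling, which is the motivation the description alludes to.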