@eoba
Joined February 2026
Projects
A general-purpose reasoning and chat model trained by NVIDIA.
1 model
A hybrid MoE model from IBM trained for tool use.
2 models
Lightweight Gemma 3-based model (270M params) trained specifically for function calling. Text-only with a 32k context window, designed to be fine-tuned into your own tool agent while remaining small enough for laptops or edge devices.
1 model
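The function-calling model above is meant to sit inside a tool-agent loop: tool schemas go into the prompt, and the model replies with a structured call that the host program parses and executes. A minimal sketch of that loop is below; the `get_weather` tool, the prompt wording, and the flat JSON call format are all illustrative assumptions, not the model's actual fine-tune format.

```python
import json

# Hypothetical tool schema for illustration; a real agent would register
# its own tools here.
TOOLS = [
    {
        "name": "get_weather",
        "description": "Look up current weather for a city.",
        "parameters": {"city": {"type": "string"}},
    }
]

def build_prompt(user_message: str) -> str:
    """Embed the tool schemas in the prompt so the model can emit a call.

    The exact wording/format a fine-tuned model expects will differ; this
    is only a sketch of the idea.
    """
    return (
        "You can call these tools by replying with JSON "
        '{"name": ..., "arguments": ...}:\n'
        + json.dumps(TOOLS, indent=2)
        + f"\n\nUser: {user_message}\nAssistant:"
    )

def parse_tool_call(model_output: str):
    """Parse the model's reply into (tool_name, arguments).

    Returns None when the reply is plain text rather than a tool call.
    """
    try:
        call = json.loads(model_output)
        return call["name"], call.get("arguments", {})
    except (json.JSONDecodeError, KeyError, TypeError):
        return None

# Simulated model reply; a real run would generate this text with the model.
reply = '{"name": "get_weather", "arguments": {"city": "Oslo"}}'
print(parse_tool_call(reply))  # → ('get_weather', {'city': 'Oslo'})
```

With a model this small (270M parameters), the generation step itself can run on a laptop or edge device; the host program stays responsible for validating and executing whatever call the model emits.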