Arcee AI Unveils Trinity Large: A 400B Sparse MoE Pushing Frontier Performance on a Budget
Arcee AI has launched Trinity Large, a 400B-parameter Sparse Mixture-of-Experts (MoE) model that demonstrates frontier-level performance across reasoning and knowledge benchmarks. The release includes three checkpoints: a fast, chat-ready Preview; the fully pre-trained Base; and the research-focused TrueBase. The training run was notably efficient, completing 17T tokens in just 33 days on 2,048 B300 GPUs.
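The announcement does not detail Trinity Large's internals, but a sparse MoE layer in general routes each token to a small subset of experts, so only a fraction of the total 400B parameters is active per forward pass. Below is a minimal, generic top-k routing sketch in PyTorch; the expert count, layer sizes, and routing scheme are illustrative assumptions, not Trinity's actual configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    """Generic top-k sparse Mixture-of-Experts layer (illustrative, not Trinity's)."""

    def __init__(self, d_model: int, d_ff: int, num_experts: int, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, num_experts, bias=False)
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model). The router scores every expert for each token.
        logits = self.router(x)
        weights, indices = torch.topk(logits, self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)

        out = torch.zeros_like(x)
        # Only the selected experts run for each token, so compute per token
        # scales with top_k rather than with the total number of experts.
        for expert_id, expert in enumerate(self.experts):
            token_idx, slot = torch.where(indices == expert_id)
            if token_idx.numel() == 0:
                continue
            out[token_idx] += weights[token_idx, slot].unsqueeze(-1) * expert(x[token_idx])
        return out

# Toy usage: 16 tokens routed across 8 experts, 2 active per token.
layer = SparseMoELayer(d_model=64, d_ff=256, num_experts=8, top_k=2)
print(layer(torch.randn(16, 64)).shape)  # torch.Size([16, 64])
```

The design choice this illustrates is why a 400B-parameter MoE can train and serve cheaply relative to a dense model of the same size: per-token FLOPs depend on the active experts, not the full parameter count.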