MiniMax unveils its own open source LLM with industry-leading 4M token context

January 14, 2025 3:46 PM

MiniMax is perhaps best known in the U.S. today as the Singaporean company behind Hailuo, a realistic, high-resolution generative AI video model that competes with Runway, OpenAI’s Sora and Luma AI’s Dream Machine.

But the company has far more tricks up its sleeve: today, for instance, it announced the release and open-sourcing of the MiniMax-01 series, a new family of models built to handle ultra-long contexts and enhance AI agent development.

The series includes MiniMax-Text-01, a foundation large language model (LLM), and MiniMax-VL-01, a visual multi-modal model.

A massive context window

The LLM, MiniMax-Text-01, is of particular note for supporting up to 4 million tokens in its context window, the equivalent of a small library’s worth of books. The context window is how much information the LLM can handle in a single input/output exchange, with words and concepts represented as numerical “tokens,” the LLM’s own internal representation of language.
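
To put that figure in perspective, here is a minimal back-of-the-envelope sketch in Python. It assumes a common rule of thumb of roughly four characters of English text per token and an average book length of about 500,000 characters; it does not use MiniMax’s actual tokenizer, and the constants are illustrative assumptions only.

```python
# Back-of-the-envelope sketch (assumptions, not MiniMax's tokenizer):
# ~4 characters of English text per token, ~500,000 characters per book.

CHARS_PER_TOKEN = 4                # rough rule-of-thumb ratio for English text
CONTEXT_WINDOW_TOKENS = 4_000_000  # MiniMax-Text-01's advertised context window
CHARS_PER_BOOK = 500_000           # assumed size of an average-length book


def estimate_tokens(text: str) -> int:
    """Very rough token estimate for a piece of text."""
    return max(1, len(text) // CHARS_PER_TOKEN)


max_chars = CONTEXT_WINDOW_TOKENS * CHARS_PER_TOKEN
books_that_fit = max_chars / CHARS_PER_BOOK

print(f"~{max_chars:,} characters fit in one exchange")      # ~16,000,000
print(f"roughly {books_that_fit:.0f} average-length books")  # roughly 32
```

By that rough estimate, a 4-million-token window can hold on the order of a few dozen full-length books in a single prompt.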
