
MiniMax M2.7: The AI That Helped Build Itself Sparks Open Source Debate

April 13, 2026

Chinese AI startup MiniMax has released the weights of its M2.7 model, which controversially participated in its own development through a self-evolution process. However, the release has drawn criticism over licensing restrictions that some argue disqualify it from being called truly open source.

MiniMax Drops M2.7 Weights Amid Chinese AI Arms Race

Chinese AI startup MiniMax has released the weights of its flagship M2.7 model, entering a fierce competition among Chinese labs pushing open-weight releases just as DeepSeek prepares to unveil its highly anticipated V4 model later this month.

MiniMax M2.7 is a sparse mixture-of-experts model with 230 billion parameters, designed to keep inference costs low while maintaining high capability. On the SWE-Pro benchmark, it scored 56.22%, matching GPT-5.3-Codex on coding tasks. On VIBE-Pro, which measures end-to-end project delivery across web, mobile, and simulation tasks, it achieved 55.6%, putting it nearly on par with Claude Opus 4.6.
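To see why a sparse mixture-of-experts design keeps inference cheap despite a large total parameter count, here is a toy top-k routing sketch. This is an illustration of the general MoE technique only, not MiniMax's actual architecture; all names and sizes are made up.

```python
import numpy as np

def moe_forward(x, experts, gate_w, top_k=2):
    """Toy sparse-MoE forward pass: route x to the top_k experts by gate score.

    Only top_k experts run per token, so inference cost tracks the *active*
    parameters, not the model's total parameter count.
    """
    scores = x @ gate_w                      # gate logits, one per expert
    top = np.argsort(scores)[-top_k:]        # indices of the top_k experts
    weights = np.exp(scores[top])
    weights /= weights.sum()                 # softmax over the selected experts
    return sum(w * experts[i](x) for i, w in zip(top, weights))

rng = np.random.default_rng(0)
dim, num_experts = 8, 4
# Each "expert" is a small linear layer with its own fixed weights.
expert_mats = [rng.normal(size=(dim, dim)) for _ in range(num_experts)]
experts = [lambda v, m=m: v @ m for m in expert_mats]
gate_w = rng.normal(size=(dim, num_experts))

x = rng.normal(size=dim)
y = moe_forward(x, experts, gate_w, top_k=2)
```

With `top_k=2` of 4 experts, each token touches only half the expert parameters; production MoE models apply the same idea at much larger scale.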

The Self-Evolution Breakthrough

What sets M2.7 apart is its role in its own creation. MiniMax describes it as the first model to participate in its own development cycle through a technique called self-evolution. The model ran autonomously for over 100 rounds, analysing its own failure patterns, modifying its scaffold code, running evaluations, and deciding whether to keep or revert changes. This process yielded a 30% performance improvement on internal benchmarks.
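The propose-evaluate-keep-or-revert loop described above can be sketched as a simple hill climb. This is a hypothetical, heavily simplified sketch of the general idea, not MiniMax's actual pipeline; `evaluate` and `propose_patch` stand in for a real benchmark harness and a model editing its own scaffold code.

```python
import random

def self_evolve(scaffold, evaluate, propose_patch, rounds=100):
    """Hypothetical self-evolution loop: propose a change, re-evaluate,
    and keep it only if the benchmark score improves; otherwise revert."""
    best_score = evaluate(scaffold)
    for _ in range(rounds):
        candidate = propose_patch(scaffold)   # the model edits its scaffold
        score = evaluate(candidate)
        if score > best_score:                # keep genuine improvements...
            scaffold, best_score = candidate, score
        # ...otherwise fall through, i.e. revert to the previous scaffold
    return scaffold, best_score

# Toy demo: the "scaffold" is a single number, the "benchmark" rewards
# closeness to a target value, and a "patch" is a small random nudge.
random.seed(1)
target = 0.7
evaluate = lambda s: -abs(s - target)
propose = lambda s: s + random.uniform(-0.1, 0.1)
final, score = self_evolve(0.0, evaluate, propose, rounds=100)
```

Because losing changes are always reverted, the score is monotone non-decreasing across rounds, which matches the keep-or-revert behaviour the article describes.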

Open Source in Name Only?

The release drew immediate backlash from the developer community. While MiniMax branded M2.7 as open source, its licence requires prior written permission for commercial use and mandates prominent attribution reading "Built with MiniMax M2.7". Critics on Reddit and Hugging Face argue this violates fundamental open-source freedoms, calling it "proprietary code with viewable weights" rather than genuine open source.

The Broader Chinese AI Landscape

M2.7 arrives in a crowded field. Zhipu AI open-sourced its 754-billion-parameter GLM-5.1 under the permissive MIT licence just days earlier. Meanwhile, all eyes are on DeepSeek V4, a trillion-parameter model expected in late April that will run on Huawei Ascend chips, marking a milestone in China's push for semiconductor self-sufficiency.
