
Towards AI4S 2.0: Shanghai AI Lab Open-Sources Intern-S1-Pro, the 1-Trillion-Parameter MoE Scientific Large Model
On February 4, 2026, the Shanghai Artificial Intelligence Laboratory open-sourced Intern-S1-Pro, a 1-trillion-parameter multimodal scientific large model built on the SAGE architecture, which integrates general and specialized capabilities. The largest open-source multimodal scientific model to date, it adopts a Mixture-of-Experts (MoE) design with 512 experts, of which only 8 (22B parameters) are activated per inference. Key architectural innovations, including Fourier Position Embedding (FoPE) and an efficient routing mechanism, address training bottlenecks at ultra-large scale.

Intern-S1-Pro achieves internationally leading performance on AI4S interdisciplinary evaluations, gold-medal-level results in Olympiad mathematical reasoning, and top-tier agent capabilities. It covers five core scientific disciplines, advances general and scientific capabilities in tandem, and builds a unified compute-algorithm foundation on domestic hardware and software ecosystems.

Released on GitHub, Hugging Face, and other platforms, the model aims to lower the barriers to research worldwide and to drive an AGI-powered shift in the scientific discovery paradigm, contributing to the construction of AGI4S infrastructure and an open scientific AI ecosystem.
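The sparse-activation scheme described above — a router selecting 8 of 512 experts per token — can be sketched as follows. This is a minimal, hypothetical illustration of top-k MoE gating, not the actual Intern-S1-Pro implementation; all names, sizes (other than the 512/8 expert counts), and the routing details are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 512   # total experts in the MoE layer (as reported)
TOP_K = 8           # experts activated per token (as reported)
D_MODEL = 64        # toy hidden size for illustration only

# Router: a learned linear map from the hidden state to one logit per expert.
W_router = rng.standard_normal((D_MODEL, NUM_EXPERTS)) / np.sqrt(D_MODEL)

def route(hidden_state: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Pick the top-k experts for one token and softmax-normalize their gates."""
    logits = hidden_state @ W_router                    # shape (NUM_EXPERTS,)
    top_idx = np.argpartition(logits, -TOP_K)[-TOP_K:]  # indices of the k largest logits
    top_logits = logits[top_idx]
    gates = np.exp(top_logits - top_logits.max())       # stable softmax over the k winners
    gates /= gates.sum()                                # mixing weights sum to 1
    return top_idx, gates

token = rng.standard_normal(D_MODEL)
experts, gates = route(token)
print(len(experts), round(float(gates.sum()), 6))  # 8 1.0
```

Only the 8 selected experts run for this token, which is why a 1-trillion-parameter model can serve inference with roughly 22B active parameters.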

