Tongyi DeepResearch Emerges: Alibaba's Sparse 30B-Parameter AI for Complex Information Seeking
Alibaba's Tongyi Lab unveils Tongyi DeepResearch, a 30.5B-parameter sparse LLM that activates just 3.3B parameters per token for efficient deep information-seeking tasks. The model features automated synthetic data generation, continual agentic pre-training, and novel reinforcement learning techniques, and outperforms competing models on benchmarks such as Humanity's Last Exam and WebWalkerQA. Developers can now access the model via HuggingFace and ModelScope for agent-based research applications.
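For readers who want to try the released weights, the sketch below shows a minimal way to load and query the model with the Hugging Face Transformers library. The repository ID and the prompt are assumptions for illustration; consult the official HuggingFace or ModelScope model pages for the exact name and recommended agentic inference setup.

```python
# Minimal sketch: loading Tongyi DeepResearch with Hugging Face Transformers.
# NOTE: the repository ID below is an assumption; verify it against the
# official HuggingFace / ModelScope listings before use.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Alibaba-NLP/Tongyi-DeepResearch-30B-A3B"  # assumed repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the checkpoint's native precision
    device_map="auto",    # shard across available GPUs
)

# Simple single-turn query; agent-based research use typically wraps the
# model in a tool-calling loop, which is beyond this sketch.
messages = [{"role": "user", "content": "Summarize recent work on sparse MoE language models."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Because only about 3.3B of the 30.5B parameters are active per token, inference compute is closer to that of a small dense model, though the full parameter set must still fit in memory.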