Qwen3.6-35B-A3B: Open-Source Sparse MoE Model with Multimodal Agentic Capabilities
Qwen3.6-35B-A3B is a sparse Mixture-of-Experts (MoE) language model with 35B total parameters, of which 3B are active per token during inference. It is released under the Apache 2.0 license and supports agentic coding workflows with…
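The "35B total / 3B active" split comes from top-k expert routing: each token's hidden state is scored by a router, and only the k highest-scoring experts actually run. Below is a toy NumPy sketch of this idea; it is not the model's actual implementation, and all names (`moe_forward`, `gate_w`, `experts`) are hypothetical illustrations.

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Toy top-k MoE layer: route one token to k of n experts.

    x:       (d,) token hidden state
    gate_w:  (n_experts, d) router weight matrix (hypothetical names)
    experts: list of (d, d) expert weight matrices

    Only the k selected experts run, which is why active parameters
    per token are a small fraction of total parameters.
    """
    logits = gate_w @ x                   # router score for every expert
    top = np.argsort(logits)[-k:]         # indices of the top-k experts
    weights = np.exp(logits[top] - logits[top].max())
    weights /= weights.sum()              # softmax over the selected experts
    out = np.zeros_like(x)
    for w, i in zip(weights, top):
        out += w * (experts[i] @ x)       # only k expert matmuls execute
    return out
```

In a real MoE transformer each expert is a full feed-forward block and routing happens per layer, but the cost structure is the same: compute scales with the k active experts, not with the total expert count.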