Tencent Open-Sources the Hunyuan-A13B Model
Jin10 Data reported that on June 27th, Tencent released and open-sourced the Hunyuan-A13B model. According to the announcement, the model is built on a Mixture of Experts (MoE) architecture with 80 billion total parameters and 13 billion active parameters. While its performance is comparable to leading open-source models, it significantly reduces inference latency and computational cost; in extreme cases, it can be deployed on a single mid-range GPU.
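The gap between total and active parameters is the key to the cost claim: in an MoE model, a router activates only a few experts per token, so most weights sit idle on any given forward pass. The sketch below illustrates the arithmetic with made-up numbers (the shared-parameter count, expert count, expert size, and top-k value are all hypothetical, chosen only so the totals match the article's 80B/13B figures; they are not Hunyuan-A13B's published architecture).

```python
# Illustrative sketch (NOT Hunyuan-A13B's actual architecture) of why an
# MoE model's "active" parameter count is far below its total: a router
# selects only the top-k experts per token, so only those experts'
# weights participate in each forward pass.

SHARED_PARAMS = 5e9      # hypothetical parameters outside the expert layers
NUM_EXPERTS = 75         # hypothetical number of experts
PARAMS_PER_EXPERT = 1e9  # hypothetical parameters per expert
TOP_K = 8                # hypothetical experts activated per token

# Every expert's weights must be stored...
total_params = SHARED_PARAMS + NUM_EXPERTS * PARAMS_PER_EXPERT
# ...but only the routed experts' weights are used per token.
active_params = SHARED_PARAMS + TOP_K * PARAMS_PER_EXPERT

print(f"total parameters: {total_params / 1e9:.0f}B")   # 80B
print(f"active per token: {active_params / 1e9:.0f}B")  # 13B
```

Compute (and thus latency) scales with the active count, while memory scales with the total, which is why such a model can fit inference onto a single mid-range GPU with aggressive quantization or offloading.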