
Chinese tech giant Alibaba is trying to revive its cloud business with large language models (LLMs). After releasing Qwen-7B, Alibaba recently released Qwen-72B, trained on 3 trillion tokens of high-quality data. Compared with previous versions, the model has a larger parameter count, an expanded context window of 32K tokens, and more customisation capabilities.

This story, “Alibaba’s Cloud Business Gets Qwen-ched!”, comes via ChinaTechNews.com.