The QwQ reasoning model is significantly leaner than DeepSeek-R1 and is said to be more powerful in many areas. While DeepSeek-R1 operates with 671 billion parameters, QwQ-32B achieves comparable performance with a much smaller footprint.
These reasoning models were designed to offer an open-source alternative to the likes of OpenAI's o1 series. QwQ-32B is a 32-billion-parameter model developed by scaling reinforcement learning.
Alibaba's Hong Kong-listed shares (HK:9988) jumped more than 8% after the company launched its new open-source AI reasoning model on Thursday. Qwen, the e-commerce leader's artificial intelligence unit, said on X that QwQ-32B, with just 32 billion parameters, can rival much larger competitors, and Alibaba says the model's performance is on par with the global hit DeepSeek-R1. Alongside the launch, the company pledged increased support for AI development in China. Its Cloud Intelligence Group, fueled by AI demand and rapid expansion, is seen as a key long-term growth driver.