Followin
Qwen
2,910 Twitter followers
Open foundation models from @alibaba_cloud
Posts
Qwen
01-28
The burst of DeepSeek V3 has attracted attention from the whole AI community to large-scale MoE models. Concurrently, we have been building Qwen2.5-Max, a large MoE LLM pretrained on massive data and post-trained with curated SFT and RLHF recipes. It achieves competitive
Qwen
01-27
We're leveling up the game with our latest open-source models, Qwen2.5-1M! 💥 Now supporting a 1 MILLION TOKEN CONTEXT LENGTH 🔥 Here's what's new: 1️⃣ Open Models: Meet Qwen2.5-7B-Instruct-1M & Qwen2.5-14B-Instruct-1M — our first-ever models handling 1M-token contexts! 🤯 2️⃣
-- END --