Qwen2.5-0.5B-Instruct is an instruction-tuned language model in the Qwen2.5 series, with approximately 500 million parameters. The main features of this model include:
Model Type: Causal Language Model
Training Stages: Pre-training and post-training
Architecture: Transformer, using RoPE, SwiGLU, RMSNorm, Attention QKV bias, and tied word embeddings
Number of Parameters: 490 million (360 million non-embedding parameters)
Number of Layers: 24 layers
Number of Attention Heads (GQA): 14 query heads, 2 key-value heads
Context Length: Supports a full 32,768-token context, with a maximum generation length of 8,192 tokens
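The grouped-query attention (GQA) figures above directly determine how small the KV cache is relative to standard multi-head attention. The sketch below estimates the per-token KV-cache footprint from the spec; the head dimension of 64 is an assumption (a 896-dim hidden size divided across 14 query heads), and fp16 storage is assumed.

```python
# Hypothetical sketch: per-token KV-cache size for Qwen2.5-0.5B-Instruct.
# Figures from the spec above: 24 layers, 14 query heads, 2 key-value heads.
# HEAD_DIM = 64 is an assumption (896 hidden size / 14 query heads).

NUM_LAYERS = 24
NUM_QUERY_HEADS = 14
NUM_KV_HEADS = 2
HEAD_DIM = 64
BYTES_FP16 = 2

def kv_cache_bytes_per_token(num_kv_heads: int) -> int:
    # Each layer stores one key and one value vector per KV head.
    return 2 * NUM_LAYERS * num_kv_heads * HEAD_DIM * BYTES_FP16

gqa = kv_cache_bytes_per_token(NUM_KV_HEADS)     # cache with 2 KV heads
mha = kv_cache_bytes_per_token(NUM_QUERY_HEADS)  # hypothetical full MHA cache
print(f"GQA: {gqa} B/token vs. MHA: {mha} B/token ({mha // gqa}x smaller)")
```

With 2 KV heads instead of 14, the cache shrinks by a factor of 7, which matters at the full 32,768-token context length.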
This model shows significant improvements in instruction following, long-text generation, and understanding of structured data, and offers multilingual support covering more than 29 languages, including English, Chinese, and French.
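As an instruction-tuned chat model, it expects conversations in a ChatML-style turn format. The sketch below assembles such a prompt by hand to show the layout; in practice you would call `tokenizer.apply_chat_template` from the `transformers` library, and the exact special-token strings here are assumptions based on that template, not an authoritative spec.

```python
# Sketch of the ChatML-style prompt layout used by Qwen chat models.
# The <|im_start|>/<|im_end|> markers are assumed from the chat template;
# use tokenizer.apply_chat_template in real code.

def build_chatml_prompt(messages: list) -> str:
    """Wrap each message in <|im_start|>role ... <|im_end|> blocks and
    open an assistant turn for the model to complete."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    parts.append("<|im_start|>assistant\n")  # generation prompt
    return "".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Introduce large language models briefly."},
])
print(prompt)
```

The trailing open `assistant` turn is what cues the model to generate its reply; the chat template applied by the tokenizer produces this ending automatically when `add_generation_prompt=True`.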