InternVL 2.5 is a series of Multimodal Large Language Models (MLLMs) built on InternVL 2.0. While retaining the core model architecture, it introduces significant optimizations in training strategies, evaluation methods, and data quality.
The Base Model provides a context window of 256 and supports a maximum output of 1,024 tokens.
Supported Platforms: LLM630 Compute Kit, Module LLM, and Module LLM Kit
apt install llm-model-internvl2.5-1b-364-ax630c
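After installation, you can confirm the package is present and see which files it ships; the commands below are a minimal sketch using standard dpkg queries (the exact file layout depends on the package itself).

```bash
# Confirm the model package is installed and inspect its contents
dpkg -s llm-model-internvl2.5-1b-364-ax630c   # show package status and version
dpkg -L llm-model-internvl2.5-1b-364-ax630c   # list the files (model weights, configs) it installs
```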
The Base Model provides a context window of 320 and supports a maximum output of 1,024 tokens.
Supported Platforms: AI Pyramid
apt install llm-model-internvl2.5-1b-448-ax650
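Once a model package is installed, the on-device LLM service can be asked to load it. The snippet below is only a rough sketch of such a setup call, assuming a JSON-over-TCP endpoint on port 10001 in the style of the StackFlow framework; the port, the JSON field names, and the model identifier (taken from the package name) are all assumptions, so consult the StackFlow protocol documentation for the exact schema.

```bash
# Rough sketch only: ask the on-device service to set up the installed model.
# Port 10001 and every JSON field name below are assumptions based on the
# StackFlow JSON-over-TCP convention; verify them against the official docs.
echo '{
  "request_id": "vlm_setup_001",
  "work_id": "vlm",
  "action": "setup",
  "object": "vlm.setup",
  "data": {
    "model": "internvl2.5-1b-364-ax630c",
    "response_format": "vlm.utf-8.stream",
    "input": "vlm.utf-8.stream",
    "enable": true
  }
}' | nc -w 2 127.0.0.1 10001
```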