Alibaba's Qwen AI Releases Compact Dense Qwen3-VL 4B/8B (Instruct and Thinking) With FP8 Checkpoints

Do you actually need a giant VLM when Qwen3-VL 4B/8B (Instruct/Thinking) with FP8 runs on low VRAM and still keeps the 256K → 1M context? Alibaba's Qwen team has extended its multimodal line with dense Qwen3-VL models at 4B and 8B scale, each shipping in two variants, Instruct and Thinking, plus FP8-quantized checkpoints aimed at lower-VRAM deployments. The drop arrives as a set of smaller siblings to the previously released 30B (MoE) and 235B (MoE) tiers and keeps the same capability surface.

What is in the release?
SKUs and variants: The new additions comprise four dense models — Qwen3-VL-4B and Qwen3-VL-8B, each in Instruct and Thinking versions — alongside FP8 checkpoints of the 4B/8B Instruct and Thinking models. The official announcement explicitly positions these as compact, lower-VRAM options that retain the full Qwen3-VL capability set.
Context length and capability surface: The model cards list a native 256K context window, expandable to 1M, and enumerate the full feature set: long-document and video understanding, 32-language OCR, 2D/3D spatial grounding, visual coding, and agentic GUI control on desktop and mobile. These capabilities carry over to the 4B/8B SKUs.
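As a rough quickstart for trying one of the dense Instruct checkpoints on an image-grounded task such as OCR, a minimal sketch follows. It is not the official snippet: it assumes a transformers version with Qwen3-VL support, and the repo id `Qwen/Qwen3-VL-8B-Instruct` and the example image URL are assumptions to verify against the model card.

```python
# Minimal sketch: image + text chat with a dense Qwen3-VL Instruct checkpoint.
# Assumes a transformers version that supports Qwen3-VL; the repo id below
# follows the release naming pattern and should be verified on Hugging Face.
from transformers import AutoModelForImageTextToText, AutoProcessor

model_id = "Qwen/Qwen3-VL-8B-Instruct"  # assumed repo id
model = AutoModelForImageTextToText.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)
processor = AutoProcessor.from_pretrained(model_id)

messages = [
    {
        "role": "user",
        "content": [
            {"type": "image", "url": "https://example.com/invoice.png"},  # placeholder image
            {"type": "text", "text": "Read all text in this image and return it as plain text."},
        ],
    }
]

# Build model inputs from the chat template (tokenized text + image tensors).
inputs = processor.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=True,
    return_dict=True,
    return_tensors="pt",
).to(model.device)

output_ids = model.generate(**inputs, max_new_tokens=512)
# Decode only the newly generated tokens, not the prompt.
print(processor.batch_decode(
    output_ids[:, inputs["input_ids"].shape[1]:], skip_special_tokens=True
)[0])
```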
Architecture notes: Qwen3-VL highlights three key updates: Interleaved-MRoPE, a positional embedding scheme that allocates frequencies across the time/width/height axes for long-horizon video; DeepStack, which fuses multi-level ViT features to sharpen fine-grained image-text alignment; and Text-Timestamp Alignment, which moves beyond T-RoPE toward precise, timestamp-grounded event localization along the video timeline. These design details come from the new model cards and signal architectural continuity across sizes.
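For intuition on the first of these changes, here is a conceptual sketch of the difference between block-partitioned and interleaved allocation of rotary frequency channels across the time/height/width axes. It is an illustration of the general idea only, not Qwen's reference implementation; the number of frequency pairs and the axis ordering are assumptions.

```python
# Conceptual sketch of multi-axis rotary position embedding (MRoPE) channel
# allocation. In a block-partitioned scheme each axis (t, h, w) owns a
# contiguous slice of frequency pairs; in an interleaved scheme the axes are
# assigned round-robin, so every axis covers both high and low frequencies.
# Illustration only, not the Qwen3-VL reference implementation.

def block_partitioned(num_pairs: int, axes=("t", "h", "w")):
    """Assign contiguous blocks of frequency pairs to each axis."""
    per_axis = num_pairs // len(axes)
    mapping = []
    for axis in axes:
        mapping.extend([axis] * per_axis)
    # Hand any remainder to the last axis.
    mapping.extend([axes[-1]] * (num_pairs - len(mapping)))
    return mapping

def interleaved(num_pairs: int, axes=("t", "h", "w")):
    """Assign frequency pairs to axes in a round-robin (interleaved) pattern."""
    return [axes[i % len(axes)] for i in range(num_pairs)]

if __name__ == "__main__":
    # With 64 rotary pairs (head_dim = 128): pair 0 is the highest frequency,
    # pair 63 the lowest. Compare which frequencies each axis gets to see.
    print("block-partitioned:", block_partitioned(64)[:12], "...")
    print("interleaved      :", interleaved(64)[:12], "...")
```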
Project timeline: The Qwen3-VL GitHub "News" section records the release of Qwen3-VL-4B (Instruct/Thinking) and Qwen3-VL-8B (Instruct/Thinking) on Oct 15, 2025, following the earlier 30B MoE tier and organization-wide FP8 checkpoints.


FP8: what is actually specified
Numerics and parity claims: The FP8 repositories state that the weights use fine-grained FP8 quantization with a block size of 128, and that operational metrics are nearly identical to the original BF16 checkpoints. For teams that validate precision policy across multimodal stacks (vision encoder, cross-modal fusion, long-context attention), vendor-produced FP8 checkpoints cut down on re-quantization work and simplify load-time verification.
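To make the "fine-grained, block size 128" claim concrete, the sketch below quantizes a weight matrix to FP8 with one scale per 128-value block along the input dimension, then dequantizes and reports the reconstruction error. It is an illustrative approximation of block-wise FP8 weight quantization in general, not the exact recipe or packing format used for the published checkpoints, and it assumes PyTorch 2.1+ for the float8_e4m3fn dtype.

```python
# Illustrative block-wise FP8 (e4m3) weight quantization with block size 128:
# each block of 128 values along the last dim gets its own scale, so an
# outlier in one block does not degrade the precision of the rest of the row.
# This approximates the general technique, not Qwen's exact checkpoint format.
import torch

BLOCK = 128
FP8_MAX = torch.finfo(torch.float8_e4m3fn).max  # ~448 for e4m3

def quantize_blockwise_fp8(w: torch.Tensor):
    out_features, in_features = w.shape
    assert in_features % BLOCK == 0, "pad weights so in_features is a multiple of 128"
    blocks = w.reshape(out_features, in_features // BLOCK, BLOCK)
    # One scale per (row, block), chosen so the block's absmax maps to FP8 max.
    scales = blocks.abs().amax(dim=-1, keepdim=True).clamp(min=1e-12) / FP8_MAX
    q = (blocks / scales).to(torch.float8_e4m3fn)
    return q, scales

def dequantize_blockwise_fp8(q: torch.Tensor, scales: torch.Tensor):
    return (q.to(torch.float32) * scales).reshape(q.shape[0], -1)

if __name__ == "__main__":
    w = torch.randn(256, 1024)  # toy weight matrix
    q, s = quantize_blockwise_fp8(w)
    w_hat = dequantize_blockwise_fp8(q, s)
    rel_err = (w - w_hat).norm() / w.norm()
    print(f"relative reconstruction error: {rel_err:.4f}")
```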
Deployment status: The 4B-Instruct-FP8 model card notes that Transformers does not yet load these FP8 weights and recommends serving with vLLM or SGLang; the card includes working launch snippets. Separately, the vLLM recipes guide recommends FP8 for memory-efficient serving on H100-class GPUs. Together, these put the FP8 checkpoints on a fast, supported serving path.
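A minimal serving sketch, under the assumption that the FP8 repo follows the `Qwen/Qwen3-VL-8B-Instruct-FP8` naming and that your vLLM build supports Qwen3-VL: launch an OpenAI-compatible server with `vllm serve`, then query it with the standard chat-completions API. Exact flags and the repo id should be taken from the model card and the vLLM recipe.

```python
# Sketch: query an FP8 Qwen3-VL checkpoint served by vLLM's OpenAI-compatible API.
# Assumed repo id, flags, and port; check the model card / vLLM recipe for exact values.
#
#   vllm serve Qwen/Qwen3-VL-8B-Instruct-FP8 --max-model-len 262144
#
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="Qwen/Qwen3-VL-8B-Instruct-FP8",  # assumed repo id
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "image_url", "image_url": {"url": "https://example.com/chart.png"}},
                {"type": "text", "text": "Summarize the trend shown in this chart."},
            ],
        }
    ],
    max_tokens=256,
)
print(response.choices[0].message.content)
```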
Key Takeaways
- Qwen has released dense Qwen3-VL 4B and 8B models, each in Instruct and Thinking variants, with FP8 checkpoints.
- The FP8 checkpoints use fine-grained FP8 quantization (block size 128) with near-BF16 metrics; Transformers loading is not supported, so use vLLM or SGLang.
- The capability surface is retained: 256K → 1M context, 32-language OCR, spatial grounding, video reasoning, and GUI/agent control.
- Model card sizes: Qwen3-VL-4B ≈ 4.83B parameters; Qwen3-VL-8B-Instruct ≈ 8.77B parameters.
Qwen's decision to ship Qwen3-VL 4B/8B in both Instruct and Thinking variants, backed by FP8 checkpoints, lowers the hardware bar for the full capability surface without splitting the model line.
Check out the model on Hugging Face and the GitHub repo.



