Microsoft's compact 0.6B multilingual embedder distilled for mid-tier and CPU deployments.
A strong 0.596B-parameter dense embedding model from Microsoft. Treat the per-modality benchmarks as the leading indicator of fit; composite scoring across modalities is still maturing.
Harrier is Microsoft Research's mid-tier embedding model: a Qwen3-0.6B-based decoder-only embedding model trained with contrastive learning plus knowledge distillation from larger teacher embedding models. It supports 94 languages and 32K-token contexts, produces 1024-dim embeddings, and is released under the MIT license.
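In practice, a dense embedder like this is queried for vectors that are then ranked by cosine similarity. A minimal sketch of that scoring step, using random stand-in vectors at the model's 1024-dim width (the vectors here are placeholders, not real model outputs):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two dense embedding vectors."""
    a = np.asarray(a, dtype=np.float64)
    b = np.asarray(b, dtype=np.float64)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Stand-in embeddings: in real use these would come from the model.
rng = np.random.default_rng(0)
dim = 1024  # matches the model's embedding width
query = rng.normal(size=dim)
docs = rng.normal(size=(3, dim))

# Rank candidate documents by similarity to the query.
scores = [cosine_similarity(query, d) for d in docs]
best = int(np.argmax(scores))
```

Because cosine similarity is scale-invariant, many pipelines L2-normalize embeddings once at index time and then use a plain dot product at query time.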