239M-parameter multilingual embedder distilled from Qwen3-Embedding-4B onto a EuroBERT backbone.
A solid 239M-parameter dense embedding model from Jina AI. Treat the modality benchmarks above as the leading indicator of fit — composite scoring across modalities is still maturing.
The lightweight (239M-parameter) sibling of jina-embeddings-v5-text-small, built on EuroBERT-210M and distilled from Qwen3-Embedding-4B with task-specific LoRA adapters and contrastive losses. Targets resource-constrained and edge deployments while still matching or exceeding all other sub-500M embedding models on MTEB and MMTEB.
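The distillation setup described above (a small student trained to match a much larger teacher via contrastive losses) can be illustrated with a toy sketch. The snippet below is not Jina's actual training code — it is a minimal, hypothetical InfoNCE-style objective in numpy, where each student embedding treats its own teacher embedding as the positive and the rest of the batch as negatives. Dimensions, temperature, and data are illustrative only.

```python
import numpy as np

def l2_normalize(x, axis=-1):
    # Unit-normalize rows so dot products become cosine similarities.
    return x / np.linalg.norm(x, axis=axis, keepdims=True)

def infonce_distill_loss(student, teacher, temperature=0.05):
    """Toy InfoNCE loss aligning student embeddings to teacher embeddings.

    Row i of `student` should be most similar to row i of `teacher` (the
    positive pair); all other teacher rows in the batch act as negatives.
    """
    s = l2_normalize(student)
    t = l2_normalize(teacher)
    logits = s @ t.T / temperature                    # (batch, batch) similarities
    logits -= logits.max(axis=1, keepdims=True)       # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Cross-entropy with the diagonal (matched pairs) as the targets.
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
teacher = rng.normal(size=(4, 8))                     # stand-in teacher embeddings
aligned = teacher + 0.01 * rng.normal(size=(4, 8))    # student close to teacher
untrained = rng.normal(size=(4, 8))                   # student before training

loss_aligned = infonce_distill_loss(aligned, teacher)
loss_untrained = infonce_distill_loss(untrained, teacher)
```

Minimizing this loss pulls each student embedding toward its teacher counterpart while pushing it away from other texts in the batch, which is why a well-distilled student should score far lower than a random one.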