239M-parameter multilingual embedder distilled from Qwen3-Embedding-4B onto a EuroBERT backbone.
The lightweight (239M-parameter) sibling of jina-embeddings-v5-text-small, built on EuroBERT-210M and distilled from Qwen3-Embedding-4B with task-specific LoRA adapters and contrastive losses. It targets resource-constrained and edge deployments while matching or exceeding all other sub-500M embedding models on MTEB and MMTEB.
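The contrastive objective mentioned above can be illustrated with a minimal InfoNCE-style loss over a batch of query/document embedding pairs. This is a hedged sketch in NumPy, not the actual training recipe: the batch size, embedding dimension, temperature value, and the `info_nce_loss` helper are all illustrative assumptions.

```python
import numpy as np

def log_softmax(x):
    # Numerically stable row-wise log-softmax
    x = x - x.max(axis=1, keepdims=True)
    return x - np.log(np.exp(x).sum(axis=1, keepdims=True))

def info_nce_loss(queries, docs, temperature=0.05):
    """InfoNCE: each query's positive is the same-index document;
    all other in-batch documents serve as negatives."""
    # L2-normalize so dot products are cosine similarities
    q = queries / np.linalg.norm(queries, axis=1, keepdims=True)
    d = docs / np.linalg.norm(docs, axis=1, keepdims=True)
    logits = (q @ d.T) / temperature   # (batch, batch) similarity matrix
    # Cross-entropy against the diagonal (the matching pairs)
    return -np.mean(np.diag(log_softmax(logits)))

rng = np.random.default_rng(0)
batch, dim = 8, 32
docs = rng.normal(size=(batch, dim))
queries = docs + 0.1 * rng.normal(size=(batch, dim))  # near-duplicate positives
loss = info_nce_loss(queries, docs)
print(loss)
```

In distillation, a similar cosine-similarity objective can additionally pull the student's embeddings toward the teacher's, alongside the in-batch contrastive term; the loss shrinks as matching pairs become more similar than mismatched ones.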