Salesforce Research's 7B contrastive embedding model, the first to top MTEB in February 2024 (research-only).
A solid 7.1B-parameter dense embedding model from Salesforce. Treat the modality benchmarks above as the leading indicator of fit; composite scoring across modalities is still maturing.
A 7B embedding model from Salesforce AI Research, built on Mistral-7B, that improves on E5-Mistral through multi-task transfer learning (notably leveraging clustering data) and task-homogeneous batching, which strengthens contrastive training with harder in-batch negatives. It was the first publicly released model to top the MTEB leaderboard (February 2024) and is released under a research-only CC-BY-NC-4.0 license.
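The in-batch-negative idea above can be sketched as an InfoNCE-style loss, where each query's paired document is the positive and every other document in the batch acts as a negative. This is a minimal NumPy illustration under stated assumptions, not Salesforce's actual training code; the function name and temperature value are illustrative.

```python
import numpy as np

def info_nce_in_batch(queries, docs, temperature=0.05):
    """InfoNCE loss with in-batch negatives.

    queries, docs: (B, D) arrays where docs[i] is the positive for
    queries[i]; all other rows of docs serve as negatives.
    """
    # L2-normalize so dot products are cosine similarities
    q = queries / np.linalg.norm(queries, axis=1, keepdims=True)
    d = docs / np.linalg.norm(docs, axis=1, keepdims=True)
    sims = q @ d.T / temperature  # (B, B); diagonal holds the positives
    # Cross-entropy of each row against its diagonal entry:
    # loss_i = log(sum_j exp(sims[i, j])) - sims[i, i]
    log_z = np.log(np.exp(sims).sum(axis=1))
    return float(np.mean(log_z - np.diag(sims)))
```

Task-homogeneous batching makes the off-diagonal documents come from the same task, so they are semantically closer to the query and act as harder negatives, sharpening the gradient signal of this loss.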