Salesforce Research's 7B contrastive embedding model, the first to top the MTEB leaderboard in February 2024 (research-only license).
A Mistral-7B-based embedding model from Salesforce AI Research that improves on E5-Mistral through multi-task transfer learning, notably by adding clustering data, and through task-homogeneous batching, which strengthens contrastive training by yielding harder in-batch negatives. It was the first publicly released model to top the MTEB leaderboard (February 2024) and is released under the research-only CC-BY-NC-4.0 license.
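Task-homogeneous batching can be sketched as follows: training examples are grouped by task before batching, so the other examples in each batch (which serve as in-batch negatives in contrastive training) come from the same task and are therefore harder to distinguish than negatives drawn from unrelated tasks. This is an illustrative sketch, not Salesforce's training code; the `task` key and function name are assumptions for the example.

```python
import random
from collections import defaultdict

def task_homogeneous_batches(examples, batch_size, seed=0):
    """Group examples by task so every batch is drawn from one task.

    `examples` is a list of dicts, each assumed to carry a 'task' key
    (hypothetical field for this sketch). In-batch negatives within a
    single-task batch tend to be semantically closer, hence harder.
    """
    rng = random.Random(seed)
    by_task = defaultdict(list)
    for ex in examples:
        by_task[ex["task"]].append(ex)

    batches = []
    for items in by_task.values():
        rng.shuffle(items)  # shuffle within a task
        for i in range(0, len(items), batch_size):
            batches.append(items[i:i + batch_size])
    rng.shuffle(batches)  # mix task order across training steps
    return batches
```

Every batch returned is single-task, so a contrastive loss computed over the batch sees only same-task negatives.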