Multilingual E5 Large is part of the E5 family of text embedding models; this large variant contains 24 layers. It is well suited for tasks such as:
- Semantic Search: Finding documents or text passages that are semantically relevant to a query (see the sketch after this list).
- Text Clustering: Grouping similar pieces of text based on their semantic meaning.
- Instruction-Guided Embeddings: Prepending a natural-language task instruction (for example, for summarization, explanation, or factual question-answering scenarios) to steer the embedding toward the intended use.
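A typical workflow embeds a query and a set of candidate passages, then ranks the passages by cosine similarity. The sketch below is a minimal local example: it assumes the open intfloat/multilingual-e5-large-instruct checkpoint loaded through the sentence-transformers library (an assumption; this page describes the managed Model Garden offering, which is called differently), and it follows the query-side instruction-prefix convention from the upstream model card, which is worth verifying for your version.

```python
# Minimal semantic-search sketch. Assumes the open
# intfloat/multilingual-e5-large-instruct checkpoint via sentence-transformers;
# the managed Vertex AI endpoint is accessed differently.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("intfloat/multilingual-e5-large-instruct")

# Query-side instruction prefix, per the upstream model card convention
# (documents are embedded without a prefix).
task = "Given a web search query, retrieve relevant passages that answer the query"
query = f"Instruct: {task}\nQuery: how do solar panels work"

documents = [
    "Solar panels convert sunlight into electricity using photovoltaic cells.",
    "The Eiffel Tower was completed in 1889.",
]

# Normalized embeddings make cosine similarity a simple dot product.
query_emb = model.encode(query, normalize_embeddings=True)
doc_embs = model.encode(documents, normalize_embeddings=True)

scores = util.cos_sim(query_emb, doc_embs)  # shape: (1, number of documents)
best = int(scores.argmax())
print(f"Best match ({float(scores[0, best]):.3f}): {documents[best]}")
```

Because the vectors are normalized 1,024-dimensional embeddings, the ranking step stays cheap even for large candidate sets.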
| Property | Value |
| --- | --- |
| Model ID | `multilingual-e5-large-instruct-maas` |
| Launch stage | GA |
| Supported inputs & outputs | Inputs: text; Outputs: embeddings |
| Output dimensions | Up to 1,024 |
| Number of layers | 24 |
| Max sequence length | 512 tokens |
| Supported languages | |
| Usage types | Supported; Not supported |
| Versions | `multilingual-e5-large` (Launch stage: GA; Release date: September 19, 2025) |
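Inputs longer than the 512-token maximum sequence length cannot be embedded in a single pass (this page does not say whether longer inputs are truncated or rejected), so long documents are usually split into overlapping windows before embedding. Below is a minimal chunking sketch, assuming the tokenizer of the open intfloat/multilingual-e5-large-instruct checkpoint from Hugging Face transformers; the managed service may count tokens differently.

```python
# Split long text into windows that fit the 512-token maximum sequence length.
# Assumes the open checkpoint's tokenizer; the managed service may tokenize
# slightly differently.
from transformers import AutoTokenizer

MAX_TOKENS = 512  # max sequence length from the table above

tokenizer = AutoTokenizer.from_pretrained("intfloat/multilingual-e5-large-instruct")

def chunk_text(text: str, max_tokens: int = MAX_TOKENS, overlap: int = 64) -> list[str]:
    """Return overlapping text windows that each fit within max_tokens."""
    window = max_tokens - 2  # leave room for special tokens added at encode time
    ids = tokenizer.encode(text, add_special_tokens=False)
    chunks = []
    for start in range(0, max(len(ids), 1), window - overlap):
        piece = ids[start:start + window]
        if not piece:
            break
        chunks.append(tokenizer.decode(piece))
    return chunks

print(len(chunk_text("A very long multilingual document. " * 400)))
```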
Supported regions

| | United States | Europe |
| --- | --- | --- |
| Model availability | `us-central1` | `europe-west4` |
| ML processing | Multi-region | Multi-region |
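Requests are served from a regional endpoint in one of the regions above. The sketch below is an illustration only: the `endpoints/openapi/embeddings` route and the request schema are assumptions to verify against the current Vertex AI documentation for this model, while the google.auth calls are standard application-default-credentials usage.

```python
# Illustrative only: the endpoints/openapi/embeddings route and the JSON schema
# below are ASSUMPTIONS; check the current Vertex AI docs for the exact request.
import google.auth
import google.auth.transport.requests
import requests

PROJECT_ID = "your-project-id"   # placeholder
REGION = "us-central1"           # or "europe-west4", per the table above
MODEL_ID = "multilingual-e5-large-instruct-maas"

# Standard application-default-credentials flow.
creds, _ = google.auth.default(scopes=["https://www.googleapis.com/auth/cloud-platform"])
creds.refresh(google.auth.transport.requests.Request())

url = (
    f"https://{REGION}-aiplatform.googleapis.com/v1/projects/{PROJECT_ID}"
    f"/locations/{REGION}/endpoints/openapi/embeddings"  # assumed route
)
resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {creds.token}"},
    json={"model": MODEL_ID, "input": ["Hola mundo", "Hello world"]},  # assumed schema
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```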
Quota

| Region | Quota |
| --- | --- |
| `us-central1` | 3,000 TPM (tokens per minute) |
| `europe-west4` | 3,000 TPM (tokens per minute) |
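Batch embedding jobs can exceed the per-region quota quickly, so a client-side throttle is a common safeguard. Below is a minimal sliding-window sketch; the 3,000 TPM figure comes from the table above, and estimate_tokens is a hypothetical stand-in for a real tokenizer-based count.

```python
# Client-side sliding-window throttle to stay under a per-region TPM quota.
# TPM_LIMIT comes from the quota table; estimate_tokens is a hypothetical
# stand-in for a real tokenizer-based count.
import time
from collections import deque

TPM_LIMIT = 3_000  # tokens per minute in us-central1 / europe-west4

_window: deque[tuple[float, int]] = deque()  # (timestamp, tokens) per request

def estimate_tokens(text: str) -> int:
    # Hypothetical rough estimate; use the model's tokenizer in practice.
    return max(1, len(text) // 4)

def wait_for_budget(tokens_needed: int) -> None:
    """Block until spending tokens_needed tokens keeps usage under TPM_LIMIT."""
    while True:
        now = time.monotonic()
        while _window and now - _window[0][0] > 60:
            _window.popleft()
        if sum(n for _, n in _window) + tokens_needed <= TPM_LIMIT:
            _window.append((now, tokens_needed))
            return
        time.sleep(1.0)

# Call before each embedding request:
wait_for_budget(estimate_tokens("Hola mundo"))
```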

