LLM models.
Classes
Claude3TextGenerator
Claude3TextGenerator(
    *,
    model_name: typing.Literal["claude-3-sonnet", "claude-3-haiku", "claude-3-5-sonnet", "claude-3-opus"] = "claude-3-sonnet",
    session: typing.Optional[bigframes.session.Session] = None,
    connection_name: typing.Optional[str] = None
)
Claude3 text generator LLM model.
Go to the Google Cloud Console -> Vertex AI -> Model Garden page to enable the models before use. You must have the Consumer Procurement Entitlement Manager Identity and Access Management (IAM) role to enable the models. See https://cloud.google.com/vertex-ai/generative-ai/docs/partner-models/use-partner-models#grant-permissions for how to grant it.
The models are only available in specific regions. Check https://cloud.google.com/vertex-ai/generative-ai/docs/partner-models/use-claude#regions for details.
model_name
str, Defaults to "claude-3-sonnet"
The model for natural language tasks. Possible values are "claude-3-sonnet", "claude-3-haiku", "claude-3-5-sonnet" and "claude-3-opus". "claude-3-sonnet" is Anthropic's dependable combination of skills and speed, engineered for reliability in scaled AI deployments across a variety of use cases. "claude-3-haiku" is Anthropic's fastest, most compact vision and text model for near-instant responses to simple queries, meant for seamless AI experiences mimicking human interactions. "claude-3-5-sonnet" is Anthropic's most powerful AI model; it maintains the speed and cost of the mid-tier Claude 3 Sonnet. "claude-3-opus" is Anthropic's second-most powerful AI model, with strong performance on highly complex tasks. See https://cloud.google.com/vertex-ai/generative-ai/docs/partner-models/use-claude#available-claude-models for the full list.
session
bigframes.Session or None
BQ session to create the model in. If None, uses the global default session.
connection_name
str or None
Connection to connect with the remote service. A str of the format <PROJECT_NUMBER/PROJECT_ID>.<LOCATION>.<CONNECTION_ID>.
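A minimal usage sketch, assuming the Claude model has been enabled in Model Garden; the connection name, model choice, and prompt data below are placeholders, and the single-column "prompt" DataFrame passed to predict() follows the usual bigframes estimator pattern:

    import bigframes.pandas as bpd
    from bigframes.ml.llm import Claude3TextGenerator

    # Hypothetical connection of the form <PROJECT>.<LOCATION>.<CONNECTION_ID>;
    # it must have access to the enabled Claude model in a supported region.
    model = Claude3TextGenerator(
        model_name="claude-3-haiku",
        connection_name="my-project.us.my-connection",
    )

    # A single-column DataFrame of prompts; predict() returns the generated
    # text alongside the input columns.
    prompts = bpd.DataFrame({"prompt": ["What is BigQuery DataFrames?"]})
    results = model.predict(prompts)
    print(results.to_pandas())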
GeminiTextGenerator
GeminiTextGenerator(
    *,
    model_name: typing.Literal["gemini-pro", "gemini-1.5-pro-preview-0514", "gemini-1.5-flash-preview-0514", "gemini-1.5-pro-001", "gemini-1.5-pro-002", "gemini-1.5-flash-001", "gemini-1.5-flash-002"] = "gemini-pro",
    session: typing.Optional[bigframes.session.Session] = None,
    connection_name: typing.Optional[str] = None,
    max_iterations: int = 300
)
Gemini text generator LLM model.
model_name
str, Defaults to "gemini-pro"
The model for natural language tasks. Accepted values are "gemini-pro", "gemini-1.5-pro-preview-0514", "gemini-1.5-flash-preview-0514", "gemini-1.5-pro-001", "gemini-1.5-pro-002", "gemini-1.5-flash-001" and "gemini-1.5-flash-002".
session
bigframes.Session or None
BQ session to create the model in. If None, uses the global default session.
connection_name
str or None
Connection to connect with the remote service. A str of the format <PROJECT_NUMBER/PROJECT_ID>.<LOCATION>.<CONNECTION_ID>.
max_iterations
int, Defaults to 300
The number of steps to run when performing supervised tuning.
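A minimal text-generation sketch; the connection name and prompt are placeholders, and max_iterations only comes into play when fine-tuning the model:

    import bigframes.pandas as bpd
    from bigframes.ml.llm import GeminiTextGenerator

    # Hypothetical connection name of the form project.location.connection_id.
    model = GeminiTextGenerator(
        model_name="gemini-1.5-flash-002",
        connection_name="my-project.us.my-connection",
    )

    prompts = bpd.DataFrame(
        {"prompt": ["Explain BigQuery table partitioning in one sentence."]}
    )
    generated = model.predict(prompts)  # adds the generated text to the frame
    print(generated.to_pandas())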
PaLM2TextEmbeddingGenerator
PaLM2TextEmbeddingGenerator(
    *,
    model_name: typing.Literal["textembedding-gecko", "textembedding-gecko-multilingual"] = "textembedding-gecko",
    version: typing.Optional[str] = None,
    session: typing.Optional[bigframes.session.Session] = None,
    connection_name: typing.Optional[str] = None
)
PaLM2 text embedding generator LLM model.
model_name
str, Defaults to "textembedding-gecko"
The model for text embedding. "textembedding-gecko" returns model embeddings for text inputs. "textembedding-gecko-multilingual" returns model embeddings for text inputs and supports over 100 languages.
version
str or None
Model version. Accepted values are "001", "002", "003", "latest", etc. Uses the default version if unset. See https://cloud.google.com/vertex-ai/docs/generative-ai/learn/model-versioning for details.
session
bigframes.Session or None
BQ session to create the model in. If None, uses the global default session.
connection_name
str or None
Connection to connect with the remote service. A str of the format <PROJECT_NUMBER/PROJECT_ID>.<LOCATION>.<CONNECTION_ID>.
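A minimal embedding sketch; the connection name and input texts are placeholders, and the "content" column name is assumed to follow the usual bigframes embedding input convention:

    import bigframes.pandas as bpd
    from bigframes.ml.llm import PaLM2TextEmbeddingGenerator

    model = PaLM2TextEmbeddingGenerator(
        model_name="textembedding-gecko-multilingual",
        version="001",  # optional; omit to use the default model version
        connection_name="my-project.us.my-connection",  # hypothetical
    )

    texts = bpd.DataFrame({"content": ["hello world", "bonjour le monde"]})
    embeddings = model.predict(texts)  # one embedding vector per input row
    print(embeddings.to_pandas())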
PaLM2TextGenerator
PaLM2TextGenerator(
    *,
    model_name: typing.Literal["text-bison", "text-bison-32k"] = "text-bison",
    session: typing.Optional[bigframes.session.Session] = None,
    connection_name: typing.Optional[str] = None,
    max_iterations: int = 300
)
PaLM2 text generator LLM model.
model_name
str, Defaults to "text-bison"
The model for natural language tasks. "text-bison" is fine-tuned to follow natural language instructions and is suitable for a variety of language tasks. "text-bison-32k" supports up to 32k tokens per request.
session
bigframes.Session or None
BQ session to create the model in. If None, uses the global default session.
connection_name
str or None
Connection to connect with the remote service. A str of the format <PROJECT_NUMBER/PROJECT_ID>.<LOCATION>.<CONNECTION_ID>.
max_iterations
int, Defaults to 300
The number of steps to run when performing supervised tuning.
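A minimal generation sketch mirroring the Claude and Gemini examples above; the connection name and prompt are placeholders, and max_iterations matters only for supervised tuning:

    import bigframes.pandas as bpd
    from bigframes.ml.llm import PaLM2TextGenerator

    model = PaLM2TextGenerator(
        model_name="text-bison-32k",  # the 32k-token-context variant
        connection_name="my-project.us.my-connection",  # hypothetical
    )

    prompts = bpd.DataFrame({"prompt": ["Write a one-line description of BigQuery."]})
    print(model.predict(prompts).to_pandas())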
TextEmbeddingGenerator
TextEmbeddingGenerator(
    *,
    model_name: typing.Literal["text-embedding-004", "text-multilingual-embedding-002"] = "text-embedding-004",
    session: typing.Optional[bigframes.session.Session] = None,
    connection_name: typing.Optional[str] = None
)
Text embedding generator LLM model.
model_name
str, Defaults to "text-embedding-004"
The model for text embedding. Possible values are "text-embedding-004" and "text-multilingual-embedding-002". The text-embedding models return embeddings for text inputs; the text-multilingual-embedding models return embeddings for text inputs and support over 100 languages.
session
bigframes.Session or None
BQ session to create the model in. If None, uses the global default session.
connection_name
str or None
Connection to connect with the remote service. A str of the format <PROJECT_NUMBER/PROJECT_ID>.<LOCATION>.<CONNECTION_ID>.
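A minimal sketch with placeholder text; leaving session and connection_name as None relies on the global default session and its connection, as described above, and the "content" column name is an assumed input convention:

    import bigframes.pandas as bpd
    from bigframes.ml.llm import TextEmbeddingGenerator

    # Uses the global default session and its connection when session and
    # connection_name are left as None.
    model = TextEmbeddingGenerator(model_name="text-multilingual-embedding-002")

    docs = bpd.DataFrame({"content": ["BigQuery DataFrames", "bigframes.ml.llm"]})
    embeddings = model.predict(docs)  # appends the embedding column(s)
    print(embeddings.to_pandas())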