Module language_models (1.25.0)

Classes for working with language models.

Classes

ChatModel

ChatModel(model_id: str, endpoint_name: Optional[str] = None)

ChatModel represents a language model that is capable of chat.

Examples

chat_model = ChatModel.from_pretrained("chat-bison@001")

chat = chat_model.start_chat(
    context="My name is Ned. You are my personal assistant. My favorite movies are Lord of the Rings and Hobbit.",
    examples=[
        InputOutputTextPair(
            input_text="Who do you work for?",
            output_text="I work for Ned.",
        ),
        InputOutputTextPair(
            input_text="What do I like?",
            output_text="Ned likes watching movies.",
        ),
    ],
    temperature=0.3,
)

chat.send_message("Do you know any cool events this weekend?")

ChatSession

ChatSession(
    model: vertexai.language_models._language_models.ChatModel,
    context: Optional[str] = None,
    examples: Optional[List[vertexai.language_models._language_models.InputOutputTextPair]] = None,
    max_output_tokens: int = 128,
    temperature: float = 0.0,
    top_k: int = 40,
    top_p: float = 0.95,
)

ChatSession represents a chat session with a language model.

Within a chat session, the model keeps context and remembers the previous conversation.
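Because the session is stateful, later messages can refer back to earlier ones. A minimal sketch, reusing the chat_model and context from the ChatModel example above:

chat = chat_model.start_chat(
    context="My name is Ned. You are my personal assistant. My favorite movies are Lord of the Rings and Hobbit.",
)
chat.send_message("Which of my favorite movies should I rewatch first?")
# The follow-up relies on the session remembering the previous turn.
chat.send_message("How long is that one?")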

InputOutputTextPair

InputOutputTextPair(input_text: str, output_text: str)

InputOutputTextPair represents a pair of input and output texts.

TextEmbedding

TextEmbedding(values: List[float], _prediction_response: Optional[Any] = None)

Contains a text embedding vector.

TextEmbeddingModel

TextEmbeddingModel(model_id: str, endpoint_name: Optional[str] = None)

TextEmbeddingModel converts text into a vector of floating-point numbers.

Examples

Getting an embedding:

model = TextEmbeddingModel.from_pretrained("textembedding-gecko@001")
embeddings = model.get_embeddings(["What is life?"])
for embedding in embeddings:
    vector = embedding.values
    print(len(vector))
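Each embedding exposes its vector through the values attribute, so two texts can be compared with ordinary vector math. A minimal sketch reusing the model from the example above; cosine_similarity is an illustrative helper, not part of the SDK:

import math

def cosine_similarity(a, b):
    # Illustrative helper, not part of the SDK.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

emb_a, emb_b = model.get_embeddings(["What is life?", "What is the meaning of life?"])
print(cosine_similarity(emb_a.values, emb_b.values))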

TextGenerationModel

TextGenerationModel(model_id: str, endpoint_name: Optional[str] = None)

TextGenerationModel represents a general language model.

Examples

Getting answers:

model = TextGenerationModel.from_pretrained("text-bison@001")
model.predict("What is life?")
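The ChatSession signature above lists the tuning parameters max_output_tokens, temperature, top_k, and top_p; the sketch below assumes predict accepts the same keyword arguments:

response = model.predict(
    "Suggest three weekend activities.",
    temperature=0.3,        # lower values give more deterministic output
    max_output_tokens=256,  # upper bound on the length of the generated text
    top_k=40,
    top_p=0.95,
)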

TextGenerationResponse

TextGenerationResponse(text: str, _prediction_response: Any)

TextGenerationResponse represents a response of a language model.
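Methods such as TextGenerationModel.predict and ChatSession.send_message return this type; the generated text is available on the text attribute. A minimal sketch:

response = model.predict("What is life?")
print(response.text)  # the generated text carried by the response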
