Package cloud.google.com/go/vertexai/genai/tokenizer (v0.13.3)

Package tokenizer provides local token counting for Gemini models. The tokenizer downloads its model data from the web, but after that it counts tokens locally, with no API call needed per [CountTokens] invocation.

CountTokensResponse

    type CountTokensResponse struct {
    	TotalTokens int32
    }

CountTokensResponse is the response of [Tokenizer.CountTokens].

Tokenizer

    type Tokenizer struct {
    	// contains filtered or unexported fields
    }

Tokenizer is a local tokenizer for text.

func New

    func New(modelName string) (*Tokenizer, error)

New creates a new [Tokenizer] from a model name. The model name is the same one you would pass to [genai.Client.GenerativeModel].

func (*Tokenizer) CountTokens

    func (tok *Tokenizer) CountTokens(parts ...genai.Part) (*CountTokensResponse, error)

CountTokens counts the tokens in all the given parts and returns their sum. Only [genai.Text] parts are supported; an error is returned if non-text parts are provided.
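The summing behavior can be illustrated with a toy stand-in: the sketch below counts whitespace-separated words instead of real Gemini tokens (the actual tokenizer uses a downloaded model), but it mirrors how CountTokens folds every part into one running total. The `countWords` helper is hypothetical and not part of this package.

```go
package main

import (
	"fmt"
	"strings"
)

// countWords is a toy analogy for Tokenizer.CountTokens: it sums a
// per-part count over all parts. Real Gemini tokenization is model-based
// and will produce different numbers; this only illustrates the shape.
func countWords(parts ...string) int32 {
	var total int32
	for _, p := range parts {
		// strings.Fields splits on any run of whitespace.
		total += int32(len(strings.Fields(p)))
	}
	return total
}

func main() {
	// Two parts, two "tokens" each, summed into one total.
	fmt.Println("total word count:", countWords("a prompt", "another prompt"))
}
```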

Example

    package main
    
    import (
    	"fmt"
    	"log"
    
    	"cloud.google.com/go/vertexai/genai"
    	"cloud.google.com/go/vertexai/genai/tokenizer"
    )
    
    func main() {
    	tok, err := tokenizer.New("gemini-1.5-flash")
    	if err != nil {
    		log.Fatal(err)
    	}
    	ntoks, err := tok.CountTokens(genai.Text("a prompt"), genai.Text("another prompt"))
    	if err != nil {
    		log.Fatal(err)
    	}
    	fmt.Println("total token count:", ntoks.TotalTokens)
    }