Google Cloud AI Platform V1 Client - Class ComputeTokensRequest (1.35.0)

Reference documentation and code samples for the Google Cloud AI Platform V1 Client class ComputeTokensRequest.

Request message for ComputeTokens RPC call.

Generated from protobuf message google.cloud.aiplatform.v1.ComputeTokensRequest

Namespace

Google \ Cloud \ AIPlatform \ V1

Methods

__construct

Constructor.

Parameters
Name
Description
data
array

Optional. Data for populating the Message object.

↳ endpoint
string

Required. The name of the Endpoint requested to get lists of tokens and token IDs.

↳ instances
array<Google\Protobuf\Value>

Optional. The instances that are the input to the token computing API call. The schema is identical to the prediction schema of the text model, even for non-text models such as chat models or Codey models.

↳ model
string

Optional. The name of the publisher model requested to serve the prediction. Format: projects/{project}/locations/{location}/publishers/*/models/*

↳ contents
array<Content>

Optional. Input content.
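
A minimal construction sketch, assuming the request is populated through the constructor's data array; the project, location, model, and instance payload below are placeholders, and the exact instance schema depends on the target model.

```php
use Google\Cloud\AIPlatform\V1\ComputeTokensRequest;
use Google\Protobuf\Value;

// Placeholder publisher model endpoint; substitute your own project,
// location, and model.
$endpoint = 'projects/my-project/locations/us-central1/publishers/google/models/text-bison';

// The instance schema depends on the target model; a bare string value is
// used here purely for illustration.
$instance = new Value();
$instance->setStringValue('How many tokens is this sentence?');

$request = new ComputeTokensRequest([
    'endpoint'  => $endpoint,
    'instances' => [$instance],
]);
```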

getEndpoint

Required. The name of the Endpoint requested to get lists of tokens and token IDs.

Returns
Type
Description
string

setEndpoint

Required. The name of the Endpoint requested to get lists of tokens and token IDs.

Parameter
Name
Description
var
string
Returns
Type
Description
$this
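
A short sketch of the endpoint accessors; the Endpoint resource name below is a placeholder.

```php
use Google\Cloud\AIPlatform\V1\ComputeTokensRequest;

$request = new ComputeTokensRequest();

// setEndpoint() returns $this, so calls can be chained.
$request->setEndpoint('projects/my-project/locations/us-central1/endpoints/1234567890');

echo $request->getEndpoint(), PHP_EOL;
```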

getInstances

Optional. The instances that are the input to the token computing API call.

The schema is identical to the prediction schema of the text model, even for non-text models such as chat models or Codey models.

Returns
Type
Description
Google\Protobuf\Internal\RepeatedField

setInstances

Optional. The instances that are the input to the token computing API call.

The schema is identical to the prediction schema of the text model, even for non-text models such as chat models or Codey models.

Parameter
Name
Description
var
array<Google\Protobuf\Value>
Returns
Type
Description
$this
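
A sketch of setting and reading back instances; the prompt text is a placeholder and the instance shape is only illustrative, since the schema follows the target model's prediction schema.

```php
use Google\Cloud\AIPlatform\V1\ComputeTokensRequest;
use Google\Protobuf\Value;

$instance = new Value();
$instance->setStringValue('The quick brown fox jumps over the lazy dog.');

$request = (new ComputeTokensRequest())->setInstances([$instance]);

// getInstances() returns an iterable RepeatedField of Value messages.
foreach ($request->getInstances() as $value) {
    echo $value->getStringValue(), PHP_EOL;
}
```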

getModel

Optional. The name of the publisher model requested to serve the prediction. Format: projects/{project}/locations/{location}/publishers/*/models/*

Returns
Type
Description
string

setModel

Optional. The name of the publisher model requested to serve the prediction. Format: projects/{project}/locations/{location}/publishers/*/models/*

Parameter
Name
Description
var
string
Returns
Type
Description
$this
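
A sketch of the model accessors; the publisher model name below is a placeholder following the documented format.

```php
use Google\Cloud\AIPlatform\V1\ComputeTokensRequest;

$request = new ComputeTokensRequest();

// Placeholder name in the form projects/{project}/locations/{location}/publishers/*/models/*.
$request->setModel('projects/my-project/locations/us-central1/publishers/google/models/gemini-1.0-pro');

echo $request->getModel(), PHP_EOL;
```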

getContents

Optional. Input content.

Returns
Type
Description
Google\Protobuf\Internal\RepeatedField

setContents

Optional. Input content.

Parameter
Name
Description
var
array<Content>
Returns
Type
Description
$this
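
A sketch of populating contents with a single user turn; the role and text are placeholders.

```php
use Google\Cloud\AIPlatform\V1\ComputeTokensRequest;
use Google\Cloud\AIPlatform\V1\Content;
use Google\Cloud\AIPlatform\V1\Part;

// One Content message holding a single text Part.
$part = (new Part())->setText('How many tokens does this prompt use?');
$content = (new Content())
    ->setRole('user')
    ->setParts([$part]);

$request = (new ComputeTokensRequest())->setContents([$content]);

echo count($request->getContents()), PHP_EOL; // 1
```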

static::build

Parameters
Name
Description
endpoint
string

Required. The name of the Endpoint requested to get lists of tokens and token IDs. Please see LlmUtilityServiceClient::endpointName() for help formatting this field.

instances
array<Google\Protobuf\Value>

Optional. The instances that are the input to the token computing API call. The schema is identical to the prediction schema of the text model, even for non-text models such as chat models or Codey models.

Returns
Type
Description
ComputeTokensRequest
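
A sketch of the static build helper; the project, location, and endpoint IDs passed to LlmUtilityServiceClient::endpointName() are placeholders. The resulting request can then be passed to the LlmUtilityServiceClient's computeTokens() call.

```php
use Google\Cloud\AIPlatform\V1\Client\LlmUtilityServiceClient;
use Google\Cloud\AIPlatform\V1\ComputeTokensRequest;
use Google\Protobuf\Value;

// Format the Endpoint resource name with the client's helper.
$endpoint = LlmUtilityServiceClient::endpointName('my-project', 'us-central1', 'my-endpoint');

// Illustrative instance payload; the schema follows the target model.
$instance = new Value();
$instance->setStringValue('Count my tokens, please.');

$request = ComputeTokensRequest::build($endpoint, [$instance]);
```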