Class ExperimentModel (1.35.0)

ExperimentModel(
    *,
    framework_name: str,
    framework_version: str,
    model_file: str,
    uri: str,
    model_class: typing.Optional[str] = None,
    predict_schemata: typing.Optional[google.cloud.aiplatform.metadata.schema.utils.PredictSchemata] = None,
    artifact_id: typing.Optional[str] = None,
    display_name: typing.Optional[str] = None,
    schema_version: typing.Optional[str] = None,
    description: typing.Optional[str] = None,
    metadata: typing.Optional[typing.Dict] = None,
    state: typing.Optional[google.cloud.aiplatform_v1.types.artifact.Artifact.State] = State.LIVE
)

An artifact representing a Vertex Experiment Model.

Properties

framework_name
The framework name of the saved ML model.

framework_version
The framework version of the saved ML model.

model_class
The class name of the saved ML model.
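
For illustration, a hedged sketch of reading these properties from an instance; the artifact ID and the values in the comments are hypothetical (see the get method under Methods for retrieving an instance):

from google.cloud.aiplatform.metadata.schema.google.artifact_schema import ExperimentModel

experiment_model = ExperimentModel.get("my-model-artifact-id")  # hypothetical artifact ID
print(experiment_model.framework_name)     # e.g. "sklearn"
print(experiment_model.framework_version)  # e.g. "1.1.0"
print(experiment_model.model_class)        # e.g. "sklearn.linear_model._base.LinearRegression"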

Methods

ExperimentModel(
    *,
    framework_name: str,
    framework_version: str,
    model_file: str,
    uri: str,
    model_class: typing.Optional[str] = None,
    predict_schemata: typing.Optional[google.cloud.aiplatform.metadata.schema.utils.PredictSchemata] = None,
    artifact_id: typing.Optional[str] = None,
    display_name: typing.Optional[str] = None,
    schema_version: typing.Optional[str] = None,
    description: typing.Optional[str] = None,
    metadata: typing.Optional[typing.Dict] = None,
    state: typing.Optional[google.cloud.aiplatform_v1.types.artifact.Artifact.State] = State.LIVE
)

Instantiates an ExperimentModel that represents a saved ML model.

Parameters
Name
Description
framework_name
str

Required. The name of the model's framework. E.g., 'sklearn'

framework_version
str

Required. The version of the model's framework. E.g., '1.1.0'

model_file
str

Required. The file name of the model. E.g., 'model.pkl'

uri
str

Required. The uniform resource identifier of the model artifact directory.

model_class
str

Optional. The class name of the model. E.g., 'sklearn.linear_model._base.LinearRegression'

predict_schemata
PredictSchemata

Optional. An instance of PredictSchemata which holds instance, parameter and prediction schema uris.

artifact_id
str

Optional. The <resource_id> portion of the Artifact name, in the format projects/123/locations/us-central1/metadataStores/<metadata_store_id>/artifacts/<resource_id>. This is globally unique within a metadataStore.

display_name
str

Optional. The user-defined name of the Artifact.

schema_version
str

Optional. schema_version specifies the version used by the Artifact. If not set, it defaults to the latest version.

description
str

Optional. Describes the purpose of the Artifact to be created.

metadata
Dict

Optional. Contains the metadata information that will be stored in the Artifact.

state
google.cloud.gapic.types.Artifact.State

Optional. The state of this Artifact. This is a property of the Artifact, and does not imply or capture any ongoing process. This property is managed by clients (such as Vertex AI Pipelines), and the system does not prescribe or check the validity of state transitions.
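
For reference, a minimal sketch of constructing the artifact object directly with the required keyword-only arguments; the import path follows the return type shown for get below, and the URI and names are hypothetical (in typical workflows this artifact is created for you, for example when a model is saved to an experiment):

from google.cloud.aiplatform.metadata.schema.google.artifact_schema import ExperimentModel

# Hypothetical values for illustration only.
model_artifact = ExperimentModel(
    framework_name="sklearn",
    framework_version="1.1.0",
    model_file="model.pkl",
    uri="gs://my-bucket/my-models/run-1/",  # directory that holds model.pkl
    display_name="my-sklearn-model",
)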

get(
    artifact_id: str,
    *,
    metadata_store_id: str = "default",
    project: typing.Optional[str] = None,
    location: typing.Optional[str] = None,
    credentials: typing.Optional[google.auth.credentials.Credentials] = None
) -> google.cloud.aiplatform.metadata.schema.google.artifact_schema.ExperimentModel

Retrieves an existing ExperimentModel artifact given an artifact id.

Parameters
Name
Description
artifact_id
str

Required. An artifact id of the ExperimentModel artifact.

metadata_store_id
str

Optional. MetadataStore to retrieve Artifact from. If not set, metadata_store_id is set to "default". If artifact_id is a fully-qualified resource name, its metadata_store_id overrides this one.

project
str

Optional. Project to retrieve the artifact from. If not set, project set in aiplatform.init will be used.

location
str

Optional. Location to retrieve the Artifact from. If not set, location set in aiplatform.init will be used.

credentials
auth_credentials.Credentials

Optional. Custom credentials to use to retrieve this Artifact. Overrides credentials set in aiplatform.init.

Exceptions
Type
Description
ValueError
If the artifact's schema title is not 'google.ExperimentModel'.
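
As an illustration, a hedged sketch of retrieving an artifact by ID; the artifact ID, project, and location are hypothetical, and project/location fall back to the values set in aiplatform.init when omitted:

from google.cloud.aiplatform.metadata.schema.google.artifact_schema import ExperimentModel

experiment_model = ExperimentModel.get(
    "my-model-artifact-id",   # hypothetical artifact ID
    metadata_store_id="default",
    project="my-project",     # hypothetical; defaults to the project from aiplatform.init
    location="us-central1",   # hypothetical; defaults to the location from aiplatform.init
)
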
get_model_info() -> typing.Dict[str, typing.Any]

Get the model's info from an experiment model artifact.
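
For example, a small sketch of inspecting the returned dictionary; the exact keys depend on the saved model, so none are assumed here:

info = experiment_model.get_model_info()
print(info)  # e.g. framework name/version, model file, and related model details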

load_model() -> typing.Union[sklearn.base.BaseEstimator, xgb.Booster, tf.Module]

Retrieves the original ML model from an ExperimentModel.

Example Usage:

 experiment_model = aiplatform.get_experiment_model("my-sklearn-model")
sk_model = experiment_model.load_model()
pred_y = sk_model.predict(test_X) 
Exceptions
Type
Description
ValueError
If the model type is not supported.
register_model(
    *,
    model_id: typing.Optional[str] = None,
    parent_model: typing.Optional[str] = None,
    use_gpu: bool = False,
    is_default_version: bool = True,
    version_aliases: typing.Optional[typing.Sequence[str]] = None,
    version_description: typing.Optional[str] = None,
    display_name: typing.Optional[str] = None,
    description: typing.Optional[str] = None,
    labels: typing.Optional[typing.Dict[str, str]] = None,
    serving_container_image_uri: typing.Optional[str] = None,
    serving_container_predict_route: typing.Optional[str] = None,
    serving_container_health_route: typing.Optional[str] = None,
    serving_container_command: typing.Optional[typing.Sequence[str]] = None,
    serving_container_args: typing.Optional[typing.Sequence[str]] = None,
    serving_container_environment_variables: typing.Optional[typing.Dict[str, str]] = None,
    serving_container_ports: typing.Optional[typing.Sequence[int]] = None,
    instance_schema_uri: typing.Optional[str] = None,
    parameters_schema_uri: typing.Optional[str] = None,
    prediction_schema_uri: typing.Optional[str] = None,
    explanation_metadata: typing.Optional[google.cloud.aiplatform_v1.types.explanation_metadata.ExplanationMetadata] = None,
    explanation_parameters: typing.Optional[google.cloud.aiplatform_v1.types.explanation.ExplanationParameters] = None,
    project: typing.Optional[str] = None,
    location: typing.Optional[str] = None,
    credentials: typing.Optional[google.auth.credentials.Credentials] = None,
    encryption_spec_key_name: typing.Optional[str] = None,
    staging_bucket: typing.Optional[str] = None,
    sync: typing.Optional[bool] = True,
    upload_request_timeout: typing.Optional[float] = None
) -> google.cloud.aiplatform.models.Model

Registers an ExperimentModel to the Vertex AI Model Registry and returns a Model representing the registered Model resource.

Example Usage:

 experiment_model = aiplatform.get_experiment_model("my-sklearn-model")
registered_model = experiment_model.register_model()
registered_model.deploy(endpoint=my_endpoint) 
Parameters
Name
Description
model_id
str

Optional. The ID to use for the registered Model, which will become the final component of the model resource name. This value may be up to 63 characters, and valid characters are [a-z0-9_-]. The first character cannot be a number or hyphen.

parent_model
str

Optional. The resource name or model ID of an existing model that the newly-registered model will be a version of. Only set this field when uploading a new version of an existing model.

use_gpu
bool

Optional. Whether or not to use GPUs for the serving container. Only specify this argument when registering a Tensorflow model and 'serving_container_image_uri' is not specified.

is_default_version
bool

Optional. When set to True, the newly registered model version will automatically have alias "default" included. Subsequent uses of this model without a version specified will use this "default" version. When set to False, the "default" alias will not be moved. Actions targeting the newly-registered model version will need to specifically reference this version by ID or alias. New model uploads, i.e. version 1, will always be "default" aliased.

version_aliases
Sequence[str]

Optional. User provided version aliases so that a model version can be referenced via alias instead of auto-generated version ID. A default version alias will be created for the first version of the model. The format is [a-z][a-zA-Z0-9-]{0,126}[a-z0-9].

version_description
str

Optional. The description of the model version being uploaded.

display_name
str

Optional. The display name of the Model. The name can be up to 128 characters long and can consist of any UTF-8 characters.

description
str

Optional. The description of the model.

labels
Dict[str, str]

Optional. The labels with user-defined metadata to organize your Models. Label keys and values can be no longer than 64 characters (Unicode codepoints), can only contain lowercase letters, numeric characters, underscores and dashes. International characters are allowed. See https://goo.gl/xmQnxf for more information and examples of labels.

serving_container_image_uri
str

Optional. The URI of the Model serving container. A pre-built container (https://cloud.google.com/vertex-ai/docs/predictions/pre-built-containers) is automatically chosen based on the model's framework. Set this field to override the default pre-built container.

serving_container_predict_route
str

Optional. An HTTP path to send prediction requests to the container, and which must be supported by it. If not specified, a default HTTP path will be used by Vertex AI.

serving_container_health_route
str

Optional. An HTTP path to send health check requests to the container, and which must be supported by it. If not specified, a standard HTTP path will be used by Vertex AI.

serving_container_command
Sequence[str]

Optional. The command with which the container is run. Not executed within a shell. The Docker image's ENTRYPOINT is used if this is not provided. Variable references $(VAR_NAME) are expanded using the container's environment. If a variable cannot be resolved, the reference in the input string will be unchanged. The $(VAR_NAME) syntax can be escaped with a double $$, ie: $$(VAR_NAME). Escaped references will never be expanded, regardless of whether the variable exists or not.

serving_container_args
Sequence[str]

Optional. The arguments to the command. The Docker image's CMD is used if this is not provided. Variable references $(VAR_NAME) are expanded using the container's environment. If a variable cannot be resolved, the reference in the input string will be unchanged. The $(VAR_NAME) syntax can be escaped with a double $$, ie: $$(VAR_NAME). Escaped references will never be expanded, regardless of whether the variable exists or not.

serving_container_environment_variables
Dict[str, str]

Optional. The environment variables that are to be present in the container. Should be a dictionary where keys are environment variable names and values are environment variable values for those names.

serving_container_ports
Sequence[int]

Optional. Declaration of ports that are exposed by the container. This field is primarily informational; it gives Vertex AI information about the network connections the container uses. Whether or not a port is listed here has no impact on whether it is actually exposed; any port listening on the default "0.0.0.0" address inside a container will be accessible from the network.

instance_schema_uri
str

Optional. Points to a YAML file stored on Google Cloud Storage describing the format of a single instance, which is used in PredictRequest.instances, ExplainRequest.instances and BatchPredictionJob.input_config. The schema is defined as an OpenAPI 3.0.2 Schema Object (https://tinyurl.com/y538mdwt#schema-object). AutoML Models always have this field populated by AI Platform. Note: The URI given on output will be immutable and probably different, including the URI scheme, than the one given on input. The output URI will point to a location where the user only has read access.

parameters_schema_uri
str

Optional. Points to a YAML file stored on Google Cloud Storage describing the parameters of prediction and explanation via PredictRequest.parameters, ExplainRequest.parameters and BatchPredictionJob.model_parameters. The schema is defined as an OpenAPI 3.0.2 Schema Object (https://tinyurl.com/y538mdwt#schema-object). AutoML Models always have this field populated by AI Platform; if no parameters are supported, it is set to an empty string. Note: The URI given on output will be immutable and probably different, including the URI scheme, than the one given on input. The output URI will point to a location where the user only has read access.

prediction_schema_uri
str

Optional. Points to a YAML file stored on Google Cloud Storage describing the format of a single prediction produced by this Model, which is returned via PredictResponse.predictions, ExplainResponse.explanations, and BatchPredictionJob.output_config. The schema is defined as an OpenAPI 3.0.2 Schema Object (https://tinyurl.com/y538mdwt#schema-object). AutoML Models always have this field populated by AI Platform. Note: The URI given on output will be immutable and probably different, including the URI scheme, than the one given on input. The output URI will point to a location where the user only has read access.

explanation_metadata
aiplatform.explain.ExplanationMetadata

Optional. Metadata describing the Model's input and output for explanation. explanation_metadata is optional while explanation_parameters must be specified when used. For more details, see Ref docs http://tinyurl.com/1igh60kt

explanation_parameters
aiplatform.explain.ExplanationParameters

Optional. Parameters to configure explaining for Model's predictions. For more details, see Ref docs http://tinyurl.com/1an4zake

encryption_spec_key_name
Optional[str]

Optional. The Cloud KMS resource identifier of the customer managed encryption key used to protect the model. Has the form projects/my-project/locations/my-region/keyRings/my-kr/cryptoKeys/my-key . The key needs to be in the same region as where the compute resource is created. If set, this Model and all sub-resources of this Model will be secured by this key. Overrides encryption_spec_key_name set in aiplatform.init.

staging_bucket
str

Optional. Bucket to stage local model artifacts. Overrides staging_bucket set in aiplatform.init.

sync
bool

Optional. Whether to execute this method synchronously. If False, this method will unblock and it will be executed in a concurrent Future.

upload_request_timeout
float

Optional. The timeout for the upload request in seconds.

Exceptions
Type
Description
ValueError
If the model doesn't have a pre-built container that is suitable for its framework and 'serving_container_image_uri' is not set.
Returns
Type
Description
model (aiplatform.Model)
Instantiated representation of the registered model resource.
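
Building on the example above, a hedged sketch of registering with a few of the optional arguments; the display name, labels, and container image are hypothetical, and serving_container_image_uri can be omitted to let Vertex AI choose a pre-built container when one exists for the model's framework:

registered_model = experiment_model.register_model(
    display_name="my-sklearn-model",
    labels={"team": "research"},  # hypothetical labels
    serving_container_image_uri="us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-0:latest",  # hypothetical image
    sync=True,
)
print(registered_model.resource_name)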