Tutorial: Run inference on an object table by using a classification model

This tutorial shows you how to create an object table based on the images from a public dataset, and then run inference on that object table using the ResNet 50 model.

The ResNet 50 model

The ResNet 50 model analyzes image files and outputs a batch of vectors representing the likelihood that an image belongs to the corresponding class (logits). For more information, see the Usage section on the model's TensorFlow Hub page.

The ResNet 50 model input takes a tensor of DType = float32 in the shape [-1, 224, 224, 3]. The output is an array of tensors of tf.float32 in the shape [-1, 1024].
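
If you download the model files (as described in the Upload the model to Cloud Storage section later in this tutorial), you can verify these shapes yourself. A minimal sketch using the saved_model_cli tool that ships with TensorFlow; the local directory name resnet_50 is a placeholder:

    # Print the input and output tensors of the model's serving signature.
    # Assumes the SavedModel was extracted to ./resnet_50.
    saved_model_cli show \
      --dir ./resnet_50 \
      --tag_set serve \
      --signature_def serving_default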

Required permissions

  • To create the dataset, you need the bigquery.datasets.create permission.
  • To create the connection resource, you need the following permissions:

    • bigquery.connections.create
    • bigquery.connections.get
  • To grant permissions to the connection's service account, you need the following permission:

    • resourcemanager.projects.setIamPolicy
  • To create the object table, you need the following permissions:

    • bigquery.tables.create
    • bigquery.tables.update
    • bigquery.connections.delegate
  • To create the bucket, you need the storage.buckets.create permission.

  • To upload the model to Cloud Storage, you need the storage.objects.create and storage.objects.get permissions.

  • To load the model into BigQuery ML, you need the following permissions:

    • bigquery.jobs.create
    • bigquery.models.create
    • bigquery.models.getData
    • bigquery.models.updateData
  • To run inference, you need the following permissions:

    • bigquery.tables.getData on the object table
    • bigquery.models.getData on the model
    • bigquery.jobs.create

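If your account is missing any of these permissions, a project administrator can grant them through predefined roles. A minimal sketch; the role choices are an assumption (roles/bigquery.admin bundles the BigQuery permissions above, and roles/storage.admin covers the Cloud Storage ones), and your organization might prefer narrower custom roles. Note that resourcemanager.projects.setIamPolicy requires a project-level role such as roles/resourcemanager.projectIamAdmin:

    # Grant broad BigQuery and Cloud Storage access to a user.
    gcloud projects add-iam-policy-binding PROJECT_ID \
      --member=user:USER_EMAIL \
      --role=roles/bigquery.admin

    gcloud projects add-iam-policy-binding PROJECT_ID \
      --member=user:USER_EMAIL \
      --role=roles/storage.admin
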
Costs

In this document, you use the following billable components of Google Cloud:

  • BigQuery: You incur storage costs for the object table you create in BigQuery.
  • BigQuery ML: You incur costs for the model you create and the inference you perform in BigQuery ML.
  • Cloud Storage: You incur costs for the objects you store in Cloud Storage.

To generate a cost estimate based on your projected usage, use the pricing calculator.

New Google Cloud users might be eligible for a free trial.

For more information on BigQuery storage pricing, see Storage pricing in the BigQuery documentation.

For more information on BigQuery ML pricing, see BigQuery ML pricing in the BigQuery documentation.

For more information on Cloud Storage pricing, see the Cloud Storage pricing page.

Before you begin

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

    Go to project selector

  3. Verify that billing is enabled for your Google Cloud project.

  4. Enable the BigQuery and BigQuery Connection APIs.

    Enable the APIs


Create a reservation

To use an imported model with an object table, you must create a reservation that uses the BigQuery Enterprise or Enterprise Plus edition, and then create a reservation assignment that uses the QUERY job type.
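
A minimal sketch of this step using the bq CLI; the reservation name, slot count, and location are assumptions you should adapt to your workload:

    # Create an Enterprise edition reservation with 100 slots.
    bq mk \
      --reservation \
      --location=us \
      --edition=ENTERPRISE \
      --slots=100 \
      resnet_reservation

    # Assign the project's QUERY jobs to the reservation.
    bq mk \
      --reservation_assignment \
      --location=us \
      --reservation_id=PROJECT_ID:us.resnet_reservation \
      --job_type=QUERY \
      --assignee_type=PROJECT \
      --assignee_id=PROJECT_ID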

Create a dataset

Create a dataset named resnet_inference_test:

SQL

  1. Go to the BigQuery page.

    Go to BigQuery

  2. In the Editor pane, run the following SQL statement:

     CREATE SCHEMA `PROJECT_ID.resnet_inference_test`;

    Replace PROJECT_ID with your project ID.

bq

  1. In the Google Cloud console, activate Cloud Shell.

    Activate Cloud Shell

  2. Run the bq mk command to create the dataset:

    bq mk --dataset --location=us PROJECT_ID:resnet_inference_test

    Replace PROJECT_ID with your project ID.

Create a connection

Create a connection named lake-connection:

Console

  1. Go to the BigQuery page.

    Go to BigQuery

  2. In the Explorer pane, click Add data.

    The Add data dialog opens.

  3. In the Filter By pane, in the Data Source Type section, select Databases.

    Alternatively, in the Search for data sources field, you can enter Vertex AI.

  4. In the Featured data sources section, click Vertex AI.

  5. Click the Vertex AI Models: BigQuery Federation solution card.

  6. In the Connection type list, select Vertex AI remote models, remote functions and BigLake (Cloud Resource).

  7. In the Connection ID field, type lake-connection.

  8. Click Create connection.

  9. In the Connection info pane, copy the value from the Service account id field and save it somewhere. You need this information to grant permissions to the connection's service account.

bq

  1. In Cloud Shell, run the bq mk command to create the connection:

     bq mk --connection --location=us \
         --connection_type=CLOUD_RESOURCE lake-connection
  2. Run the bq show command to retrieve information about the connection:

     bq show --connection us.lake-connection 
    
  3. From the properties column, copy the value of the serviceAccountId property and save it somewhere. You need this information to grant permissions to the connection's service account.

Create a Cloud Storage bucket

Create a Cloud Storage bucket to contain the model files.
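
For example, in Cloud Shell you can run the gcloud storage buckets create command; BUCKET_NAME is a name you choose, and the location should match your dataset and connection:

    # Create the bucket that will hold saved_model.pb and the variables folder.
    gcloud storage buckets create gs://BUCKET_NAME --location=us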

Grant permissions to the connection's service account

Console

  1. Go to the IAM & Admin page.

    Go to IAM & Admin

  2. Click Grant Access.

    The Add principals dialog opens.

  3. In the New principalsfield, enter the service account ID that you copied earlier.

  4. In the Select a role field, select Cloud Storage, and then select Storage Object Viewer.

  5. Click Save.

gcloud

In Cloud Shell, run the gcloud storage buckets add-iam-policy-binding command:

gcloud storage buckets add-iam-policy-binding gs://BUCKET_NAME \
  --member=serviceAccount:MEMBER \
  --role=roles/storage.objectViewer

Replace MEMBER with the service account ID that you copied earlier. Replace BUCKET_NAME with the name of the bucket you previously created.

For more information, see Add a principal to a bucket-level policy.
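
To confirm that the binding took effect, you can read back the bucket's IAM policy and look for the service account under roles/storage.objectViewer:

    # Print the bucket's IAM bindings.
    gcloud storage buckets get-iam-policy gs://BUCKET_NAME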

Create an object table

Create an object table named vision_images based on the image files in the public gs://cloud-samples-data/vision bucket:

SQL

  1. Go to the BigQuery page.

    Go to BigQuery

  2. In the Editor pane, run the following SQL statement:

     CREATE EXTERNAL TABLE resnet_inference_test.vision_images
     WITH CONNECTION `us.lake-connection`
     OPTIONS(
       object_metadata = 'SIMPLE',
       uris = ['gs://cloud-samples-data/vision/*.jpg']
     );

bq

In Cloud Shell, run the bq mk command to create the object table:

  bq mk --table \
    --external_table_definition='gs://cloud-samples-data/vision/*.jpg@us.lake-connection' \
    --object_metadata=SIMPLE \
    resnet_inference_test.vision_images

Upload the model to Cloud Storage

Get the model files and make them available in Cloud Storage:

  1. Download the ResNet 50 model to your local machine. This gives you a saved_model.pb file and a variables folder for the model.
  2. Upload the saved_model.pb file and the variables folder to the bucket you previously created.
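
A sketch of both steps from the command line. The TensorFlow Hub download URL format (?tf-hub-format=compressed) and the local directory name are assumptions; you can also download the model in a browser and upload it through the Cloud Storage console:

    # Download and extract the compressed SavedModel from TensorFlow Hub.
    mkdir resnet_50
    curl -L "https://tfhub.dev/tensorflow/resnet_50/classification/1?tf-hub-format=compressed" \
      | tar -xz -C resnet_50

    # Upload saved_model.pb and the variables folder to your bucket.
    gcloud storage cp -r resnet_50/* gs://BUCKET_NAME/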

Load the model into BigQuery ML

  1. Go to the BigQuery page.

    Go to BigQuery

  2. In the Editor pane, run the following SQL statement:

     CREATE MODEL `resnet_inference_test.resnet`
     OPTIONS(
       model_type = 'TENSORFLOW',
       model_path = 'gs://BUCKET_NAME/*'
     );

    Replace BUCKET_NAME with the name of the bucket you previously created.
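
If you prefer the command line, the same statement can be run with bq query; a sketch using the same placeholders:

    bq query --use_legacy_sql=false \
      "CREATE MODEL \`resnet_inference_test.resnet\`
       OPTIONS(
         model_type = 'TENSORFLOW',
         model_path = 'gs://BUCKET_NAME/*'
       );"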

Inspect the model

Inspect the uploaded model to see what its input and output fields are:

  1. Go to the BigQuery page.

    Go to BigQuery

  2. In the Explorer pane, expand your project, expand the resnet_inference_test dataset, and then expand the Models node.

  3. Click the resnet model.

  4. In the model pane that opens, click the Schema tab.

  5. Look at the Labels section. This identifies the fields that are output by the model. In this case, the field name value is activation_49.

  6. Look at the Features section. This identifies the fields that must be input into the model. You reference them in the SELECT statement for the ML.DECODE_IMAGE function. In this case, the field name value is input_1.
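
You can also inspect the model from the command line with bq show, which prints the model metadata, including its feature (input) and label (output) columns:

    # Show the model's metadata as JSON.
    bq show --model --format=prettyjson resnet_inference_test.resnet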

Run inference

Run inference on the vision_images object table using the resnet model:

  1. Go to the BigQuery page.

    Go to BigQuery

  2. In the Editor pane, run the following SQL statement:

     SELECT *
     FROM ML.PREDICT(
       MODEL `resnet_inference_test.resnet`,
       (SELECT
          uri,
          ML.RESIZE_IMAGE(ML.DECODE_IMAGE(data), 224, 224, FALSE) AS input_1
        FROM resnet_inference_test.vision_images)
     );

    The results should look similar to the following:

      
    | activation_49          | uri                                                                                           | input_1 |
    |------------------------|-----------------------------------------------------------------------------------------------|---------|
    | 1.0254175464297077e-07 | gs://cloud-samples-data/vision/automl_classification/flowers/daisy/21652746_cc379e0eea_m.jpg | 0.0     |
    | 2.1671139620593749e-06 |                                                                                               | 0.0     |
    | 8.346052027263795e-08  |                                                                                               | 0.0     |
    | 1.159310958342985e-08  |                                                                                               | 0.0     |

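The activation_49 column holds one logit per class for each image. As a follow-up, you can pick out the index of the largest logit, which corresponds to the model's most likely class. A sketch, assuming activation_49 comes back as an ARRAY<FLOAT64> column; the top_class_index alias is hypothetical:

    bq query --use_legacy_sql=false '
    SELECT
      uri,
      -- Index of the largest logit, i.e. the most likely class.
      (SELECT idx
       FROM UNNEST(activation_49) AS logit WITH OFFSET AS idx
       ORDER BY logit DESC
       LIMIT 1) AS top_class_index
    FROM ML.PREDICT(
      MODEL `resnet_inference_test.resnet`,
      (SELECT
         uri,
         ML.RESIZE_IMAGE(ML.DECODE_IMAGE(data), 224, 224, FALSE) AS input_1
       FROM resnet_inference_test.vision_images))'
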
Clean up

  1. In the Google Cloud console, go to the Manage resources page.

    Go to Manage resources

  2. In the project list, select the project that you want to delete, and then click Delete.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.