Analyze multimodal data in Python with BigQuery DataFrames
This tutorial shows you how to analyze multimodal data in a Python notebook by using BigQuery DataFrames classes and methods.
This tutorial uses the product catalog from the public Cymbal pet store dataset.
To upload a notebook already populated with the tasks covered in this tutorial, see BigFrames Multimodal DataFrame.
Objectives
- Create multimodal DataFrames.
- Combine structured and unstructured data in a DataFrame.
- Transform images.
- Generate text and embeddings based on image data.
- Chunk PDFs for further analysis.
Costs
In this document, you use the following billable components of Google Cloud:
- BigQuery: you incur costs for the data that you process in BigQuery.
- BigQuery Python UDFs: you incur costs for using BigQuery DataFrames image transformation and chunk PDF methods.
- Cloud Storage: you incur costs for the objects stored in Cloud Storage.
- Vertex AI: you incur costs for calls to Vertex AI models.
To generate a cost estimate based on your projected usage, use the pricing calculator.
For more information, see the following pricing pages:
Before you begin
- In the Google Cloud console, on the project selector page, select or create a Google Cloud project.
- Verify that billing is enabled for your Google Cloud project.
- Enable the BigQuery, BigQuery Connection, Cloud Storage, and Vertex AI APIs.
Required roles
To get the permissions that you need to complete this tutorial, ask your administrator to grant you the following IAM roles:
- Create a connection: BigQuery Connection Admin (roles/bigquery.connectionAdmin)
- Grant permissions to the connection's service account: Project IAM Admin (roles/resourcemanager.projectIamAdmin)
- Create a Cloud Storage bucket: Storage Admin (roles/storage.admin)
- Run BigQuery jobs: BigQuery User (roles/bigquery.user)
- Create and call Python UDFs: BigQuery Data Editor (roles/bigquery.dataEditor)
- Create URLs that let you read and modify Cloud Storage objects: BigQuery ObjectRef Admin (roles/bigquery.objectRefAdmin)
- Use notebooks:
  - BigQuery Read Session User (roles/bigquery.readSessionUser)
  - Notebook Runtime User (roles/aiplatform.notebookRuntimeUser)
  - Code Creator (roles/dataform.codeCreator)
For more information about granting roles, see Manage access to projects, folders, and organizations .
You might also be able to get the required permissions through custom roles or other predefined roles.
Set up
In this section, you create the Cloud Storage bucket, connection, and notebook used in this tutorial.
Create a bucket
Create a Cloud Storage bucket for storing transformed objects:
- In the Google Cloud console, go to the Buckets page.
- Click Create.
- On the Create a bucket page, in the Get started section, enter a globally unique name that meets the bucket name requirements.
- Click Create.
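If you prefer to create the bucket programmatically, the following is a minimal sketch using the google-cloud-storage Python client library. The bucket name and location are placeholders; replace them with your own values, and note that the client relies on Application Default Credentials being configured.

```python
from google.cloud import storage

# Authenticates with Application Default Credentials.
client = storage.Client()

# Bucket names must be globally unique; this name is a placeholder.
bucket = client.create_bucket("my-multimodal-tutorial-bucket", location="US")
print(f"Created bucket {bucket.name}")
```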
Create a connection
Create a Cloud resource connection and get the connection's service account. BigQuery uses the connection to access objects in Cloud Storage.
- Go to the BigQuery page.
- In the Explorer pane, click Add data.
  The Add data dialog opens.
- In the Filter By pane, in the Data Source Type section, select Business Applications.
  Alternatively, in the Search for data sources field, you can enter Vertex AI.
- In the Featured data sources section, click Vertex AI.
- Click the Vertex AI Models: BigQuery Federation solution card.
- In the Connection type list, select Vertex AI remote models, remote functions and BigLake (Cloud Resource).
- In the Connection ID field, type bigframes-default-connection.
- Click Create connection.
- Click Go to connection.
- In the Connection info pane, copy the service account ID for use in a later step.
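The connection can also be created from Python instead of the console. The following is a sketch using the google-cloud-bigquery-connection client library; the project ID and location are placeholders, and the printed service account ID is the value you copy for the next section.

```python
from google.cloud import bigquery_connection_v1 as bq_connection

client = bq_connection.ConnectionServiceClient()

# Placeholders: use your own project ID and the location of your data.
parent = client.common_location_path("your-project-id", "us")

# A Cloud resource connection carries no extra configuration.
connection = bq_connection.Connection(
    cloud_resource=bq_connection.CloudResourceProperties()
)

response = client.create_connection(
    parent=parent,
    connection_id="bigframes-default-connection",
    connection=connection,
)

# The service account that you grant roles to in the next section.
print(response.cloud_resource.service_account_id)
```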
Grant permissions to the connection's service account
Grant the connection's service account the roles that it needs to access Cloud Storage and Vertex AI. You must grant these roles in the same project you created or selected in the Before you begin section.
To grant the role, follow these steps:
- Go to the IAM & Admin page.
- Click Grant access.
- In the New principals field, enter the service account ID that you copied earlier.
- In the Select a role field, choose Cloud Storage, and then select Storage Object User.
- Click Add another role.
- In the Select a role field, select Vertex AI, and then select Vertex AI User.
- Click Save.
Create a notebook
Create a notebook where you can run Python code:
- Go to the BigQuery page.
- In the tab bar of the editor pane, click the drop-down arrow next to SQL query, and then click Notebook.
- In the Start with a template pane, click Close.
- Click Connect > Connect to a runtime.
- If you have an existing runtime, accept the default settings and click Connect. If you don't have an existing runtime, select Create new Runtime, and then click Connect.
  It might take several minutes for the runtime to get set up.
Create a multimodal DataFrame
Create a multimodal DataFrame that integrates structured and unstructured data by using the from_glob_path method of the Session class:
- In the notebook, create a code cell and copy the following code into it:
- Click Run.
  The final call to df_image returns the images that have been added to the DataFrame. Alternatively, you could call the .display method.
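The contents of the code cell aren't reproduced here; a minimal sketch of such a cell follows, assuming the Cymbal pet store images are available in a public Cloud Storage bucket. The project ID and the glob path are illustrative placeholders.

```python
import bigframes.pandas as bpd

# Placeholder: set your own project before the first query runs.
bpd.options.bigquery.project = "your-project-id"

# Build a DataFrame in which each row references one image object in
# Cloud Storage. The glob path below is illustrative.
df_image = bpd.from_glob_path(
    "gs://cloud-samples-data/bigquery/tutorials/cymbal-pets/images/*",
    name="image",
)

# Keep only a few rows so that later model calls stay inexpensive.
df_image = df_image.head(4)

# The final expression in the cell renders the images in the notebook.
df_image
```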
Combine structured and unstructured data in the DataFrame
Combine text and image data in the multimodal DataFrame:
- In the notebook, create a code cell and copy the following code into it:
- Click Run.
  The code returns the DataFrame data.
- In the notebook, create a code cell and copy the following code into it:
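The code for these cells isn't reproduced here; a minimal sketch follows, assuming the df_image DataFrame from the previous section. The author values are placeholders (one value per row), and the blob accessor calls surface Cloud Storage object metadata as regular structured columns.

```python
# Add a structured column next to the unstructured image column.
# Placeholder values: supply one entry per row of df_image.
df_image["author"] = ["alice", "bob", "bob", "alice"]

# Blob accessors expose object metadata as new structured columns.
df_image["content_type"] = df_image["image"].blob.content_type()
df_image["size"] = df_image["image"].blob.size()

df_image
```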