Generative AI overview

This document describes the generative artificial intelligence (AI) functions that BigQuery supports. These functions accept natural language inputs and use pre-trained Vertex AI models and built-in BigQuery models.

Overview

BigQuery offers a variety of AI functions to help with tasks such as the following:

  • Generate creative content.
  • Analyze content, detect sentiment, and answer questions about text or unstructured data, such as images.
  • Summarize the key ideas or impressions conveyed by the content.
  • Extract structured data from text.
  • Classify text or unstructured data into user-defined categories.
  • Generate embeddings to search for similar text, images, and video.
  • Rate inputs in order to rank them by quality, similarity, or other criteria.

BigQuery offers the following categories of AI functions to help you accomplish these tasks:

  • General-purpose AI functions: These functions give you full control and transparency over the choice of model, prompt, and parameters to use.

    • Perform LLM inference, such as to answer questions about your data:

      • AI.GENERATE is the most flexible inference function, which lets you analyze any structured or unstructured data.
      • AI.GENERATE_TEXT is a table-valued version of AI.GENERATE that also supports partner models and open models.
    • Generate structured output, such as extracting names, addresses, or object descriptions from text, documents, or images:

      • AI.GENERATE, when you specify an output schema.
      • AI.GENERATE_TABLE is a table-valued version of AI.GENERATE that calls a remote model and lets you specify a custom output schema.
      • If your output schema has a single field, you can use one of the specialized functions: AI.GENERATE_BOOL, AI.GENERATE_DOUBLE, or AI.GENERATE_INT.
    • Generate embeddings for semantic search and clustering:

      • AI.EMBED: Create an embedding from text or image data.
      • AI.GENERATE_EMBEDDING: A table-valued function that adds a column of embedded text, image, audio, video, or document data to your table.
  • Managed AI functions: These functions have a streamlined syntax and are optimized for cost and quality. BigQuery chooses a model for you.

    • Filter your data with natural language conditions

      • AI.IF
    • Rate input, such as by quality or sentiment

      • AI.SCORE
    • Classify input into user-defined categories

      • AI.CLASSIFY
  • Task-specific functions: These functions use Cloud AI APIs to help you perform tasks such as natural language processing, machine translation, document processing, audio transcription, and computer vision.

General-purpose AI functions

General-purpose AI functions give you full control and transparency over the choice of model, prompt, and parameters to use. Their output includes detailed information about the call to the model, including the status and full model response, which might include information about the safety rating or citations.

Perform LLM inference

The AI.GENERATE function is a flexible inference function that works by sending requests to a Vertex AI Gemini model and returning that model's response. You can use this function to analyze text, image, audio, video, or PDF data. For example, you might analyze images of home furnishings to generate text for a design_type column, so that each furnishing SKU has an associated description, such as mid-century modern or farmhouse.
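As a sketch, a query like the following could populate such a design_type column. The table, column, connection ID, and endpoint names here are placeholders, not part of the original example:

```sql
-- Hypothetical table and connection names. AI.GENERATE returns a STRUCT
-- whose result field contains the model's text response.
SELECT
  sku,
  AI.GENERATE(
    ('Describe the furniture design style in one or two words: ', description),
    connection_id => 'us.example_connection',
    endpoint => 'gemini-2.0-flash'
  ).result AS design_type
FROM
  `my_project.my_dataset.furnishings`;
```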

You can also perform generative AI tasks by using remote models in BigQuery ML to reference models deployed to or hosted in Vertex AI with the AI.GENERATE_TEXT table-valued function, which supports several types of remote models, including partner models and open models.

To try text generation in BigQuery ML, see the text generation tutorials in the BigQuery documentation.

For some models, you can optionally choose to configure supervised tuning , which lets you train the model on your own data to make it better suited for your use case. All inference occurs in Vertex AI. The results are stored in BigQuery.

Generate structured data

Structured data generation is very similar to text generation, except that you can format the response from the model by specifying a SQL schema. For example, you might generate a table that contains a customer's name, phone number, address, request, and pricing quote from a transcript of a phone call.

You can generate structured data by using the AI.GENERATE function with an output schema, the AI.GENERATE_TABLE table-valued function, or one of the specialized single-field functions such as AI.GENERATE_BOOL.
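As a minimal sketch of the phone-call example, assuming a hypothetical call_transcripts table and connection, you might pass an output schema to AI.GENERATE so that the model's response is returned as typed fields instead of free text:

```sql
-- Hypothetical table and connection names. With output_schema specified,
-- the returned STRUCT contains one field per schema column.
SELECT
  AI.GENERATE(
    ('Extract the customer details from this call transcript: ', transcript),
    connection_id => 'us.example_connection',
    endpoint => 'gemini-2.0-flash',
    output_schema => 'name STRING, phone_number STRING, address STRING'
  ) AS customer
FROM
  `my_project.my_dataset.call_transcripts`;
```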

Generate embeddings

An embedding is a high-dimensional numerical vector that represents a given entity, like a piece of text or an audio file. Generating embeddings lets you capture the semantics of your data in a way that makes it easier to reason about and compare the data.

Some common use cases for embedding generation are as follows:

  • Using retrieval-augmented generation (RAG) to augment model responses to user queries by referencing additional data from a trusted source. RAG provides better factual accuracy and response consistency, and also provides access to data that is newer than the model's training data.
  • Performing multimodal search. For example, using text input to search images.
  • Performing semantic search to find similar items for recommendations, substitution, and record deduplication.
  • Creating embeddings to use with a k-means model for clustering.

For more information about how to generate embeddings and use them to perform these tasks, see the Introduction to embeddings and vector search .
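For instance, a sketch of embedding generation with AI.EMBED might look like the following; the table, connection, and endpoint names are assumptions for illustration:

```sql
-- Hypothetical table and connection names. The embedding vector is in the
-- result field of the STRUCT that AI.EMBED returns.
SELECT
  review_id,
  AI.EMBED(
    review_text,
    connection_id => 'us.example_connection',
    endpoint => 'text-embedding-005'
  ).result AS review_embedding
FROM
  `my_project.my_dataset.product_reviews`;
```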

Managed AI functions

Managed AI functions are purpose-built to automate routine tasks, such as classification, ordering, or filtering. These functions use Gemini and don't require customization. BigQuery uses prompt engineering and selects the appropriate model and parameters for the specific task to optimize the quality and consistency of your results. Each function returns a scalar value, such as a BOOL, FLOAT64, or STRING, and doesn't include additional status information from the model. The following managed AI functions are available:

  • AI.IF: Filter text or multi-modal data, such as in a WHERE or JOIN clause, based on a prompt. For example, you could filter product descriptions by those that describe an item that would make a good gift.
  • AI.SCORE: Rate inputs based on a prompt in order to rank rows by quality, similarity, or other criteria. You can use this function in an ORDER BY clause to extract the top K items according to score. For example, you could find the top 10 most positive or negative user reviews for a product.
  • AI.CLASSIFY: Classify text into user-defined categories. You can use this function in a GROUP BY clause to group inputs according to the categories that you define. For example, you could classify support tickets by whether they relate to billing, shipping, product quality, or something else.
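The support-ticket example could be sketched as follows; the table and connection names are hypothetical:

```sql
-- Hypothetical table and connection names. AI.CLASSIFY returns one of the
-- supplied categories, so the result can be grouped on directly.
SELECT
  AI.CLASSIFY(
    ticket_text,
    categories => ['billing', 'shipping', 'product quality', 'other'],
    connection_id => 'us.example_connection'
  ) AS category,
  COUNT(*) AS num_tickets
FROM
  `my_project.my_dataset.support_tickets`
GROUP BY
  category;
```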

For a tutorial that shows examples of how to use these functions, see Perform semantic analysis with managed AI functions .

For a notebook tutorial that shows how to use managed and general-purpose AI functions, see Semantic analysis with AI functions .

Task-specific functions

In addition to the more general functions described in the previous sections, you can develop task-specific solutions in BigQuery ML by using Cloud AI APIs. Supported tasks include natural language processing, machine translation, document processing, audio transcription, and computer vision.

For more information, see Task-specific solutions overview .

Locations

Supported locations for text generation and embedding models vary based on the model type and version that you use. For more information, see Locations .

Pricing

You are charged for the compute resources that you use to run queries against models. Remote models make calls to Vertex AI models, so queries against remote models also incur charges from Vertex AI.

For more information, see BigQuery ML pricing .

Track costs

The generative AI functions in BigQuery work by sending requests to Vertex AI, which can incur costs. To track the Vertex AI costs incurred by a job that you run in BigQuery, follow these steps:

  1. View your billing reports in Cloud Billing.
  2. Use filters to refine your results.

    For services, select Vertex AI.

  3. To see the charges for a specific job, filter by label.

    Set the key to bigquery_job_id_prefix and the value to the job ID of your job. If your job ID is longer than 63 characters, use only the first 63 characters. If your job ID contains any uppercase characters, change them to lowercase. Alternatively, you can associate jobs with a custom label to help you look them up later.

It can take up to 24 hours for some charges to appear in Cloud Billing.
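The custom-label approach from step 3 can be sketched as follows; the label key and value, table, and connection names are hypothetical. In a multi-statement query or session, you can set a query label before running the AI query, and then filter on that label in Cloud Billing to find the job's charges:

```sql
-- Hypothetical label, table, and connection names. The label is attached
-- to the query job so you can look up its charges later.
SET @@query_label = 'cost_center:ai_experiments';

SELECT
  AI.SCORE(
    ('Rate this review from 1 to 10 for helpfulness: ', review_text),
    connection_id => 'us.example_connection'
  ) AS helpfulness
FROM
  `my_project.my_dataset.product_reviews`
ORDER BY helpfulness DESC
LIMIT 10;
```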

Monitoring

To better understand the behavior of AI functions that you call in BigQuery, you can enable request and response logging. To log the entire request and response sent to and received from Vertex AI, follow these steps:

  1. Enable request-response logs in Vertex AI. The logs are stored in BigQuery. You must separately enable logging for each different foundation model and region. To log queries that run in the us region, specify the us-central1 region in your request. To log queries that run in the eu region, specify the europe-west4 region in your request.

  2. Run a query using an AI function that makes a call to Vertex AI using the model that you enabled logging for in the previous step.

  3. To view the full Vertex AI request and response, query your logging table for rows where the labels.bigquery_job_id_prefix field of the full_request column matches the first 63 characters of your job ID. Optionally, you can use a custom query label to help you look up the query in the logs.

    For example, you can use a query similar to the following:

      SELECT
        *
      FROM
        `my_project.my_dataset.request_response_logging`
      WHERE
        JSON_VALUE(full_request, '$.labels.bigquery_job_id_prefix') = 'bquxjob_123456...';


What's next
