Choose a text generation function
This document provides a comparison of the BigQuery ML
[`ML.GENERATE_TEXT`](/bigquery/docs/reference/standard-sql/bigqueryml-syntax-generate-text)
and [`AI.GENERATE`](/bigquery/docs/reference/standard-sql/bigqueryml-syntax-ai-generate)
text generation functions. You can use the information in this document to
help you decide which function to use in cases where the functions have
overlapping capabilities.
Function similarities
The `ML.GENERATE_TEXT` and `AI.GENERATE` functions are similar in the
following ways:
- **Purpose**: Generate text by passing a prompt to a large language model
  (LLM).
- **Billing**: Incur BigQuery ML charges for data processed.
  For more information, see
  [BigQuery ML pricing](/bigquery/pricing#bigquery-ml-pricing).
  Incur Vertex AI charges for calls to the LLM. If you are using
  a Gemini 2.0 or later model, the call is billed at the batch
  API rate. For more information, see
  [Cost of building and deploying AI models in Vertex AI](/vertex-ai/generative-ai/pricing).
- **Scalability**: Process between 1 million and 10 million rows for each
  6-hour query job. Actual throughput depends on factors like the average token
  length in the input rows. For more information, see
  [Generative AI functions](/bigquery/quotas#generative_ai_functions).
- **Input data**: Support both text and unstructured data from
  BigQuery standard tables and object tables.
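
To illustrate that shared purpose, the following sketch sends the same prompt through both functions. The dataset, model, and connection names (`mydataset.gemini_model`, `us.my_connection`) are placeholder assumptions, not values from this document:

```sql
-- ML.GENERATE_TEXT: table-valued, so it reads prompts from a table or
-- subquery. It requires a remote model that you create beforehand.
SELECT ml_generate_text_llm_result
FROM ML.GENERATE_TEXT(
  MODEL `mydataset.gemini_model`,
  (SELECT 'Summarize the plot of Hamlet in one sentence.' AS prompt),
  STRUCT(TRUE AS flatten_json_output)
);

-- AI.GENERATE: scalar, so it can appear directly in a SELECT list.
-- No remote model is needed, only a connection and a model endpoint.
SELECT AI.GENERATE(
  'Summarize the plot of Hamlet in one sentence.',
  connection_id => 'us.my_connection',
  endpoint => 'gemini-2.0-flash'
).result;
```

Because `AI.GENERATE` returns a `STRUCT`, the `.result` field access above extracts just the generated text from the response.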
Function differences
Use the following table to evaluate the differences between the
`ML.GENERATE_TEXT` and `AI.GENERATE` functions:

| | `ML.GENERATE_TEXT` | `AI.GENERATE` |
|---|---|---|
| Function signature | A table-valued function that takes a table as input and returns a table as output. | A scalar function that takes a single value as input and returns a single value as output. |
| Supported LLMs | <ul><li>Gemini models</li><li>Partner models such as Anthropic Claude, Llama, and Mistral AI</li><li>Open models</li></ul> | Gemini models |
| Function output content | For Gemini models: <ul><li>Generated text</li><li>Responsible AI (RAI) results</li><li>Google Search grounding results, if enabled</li><li>LLM call status</li></ul> For other types of models: <ul><li>Generated text</li><li>LLM call status</li></ul> | <ul><li>Generated text</li><li>Full model response in JSON format</li><li>LLM call status</li></ul> |
| Function output format | Generated values are returned in a single JSON column or in separate table columns, depending on the `flatten_json_output` argument value. | Generated values are returned as fields in a `STRUCT` object. |
| User journey | You must create a [remote model](/bigquery/docs/reference/standard-sql/bigqueryml-syntax-create-remote-model#create_model_syntax) before using the function. | You can use the function directly, without the need to create a remote model. |
| Permission setup | You must manually create a BigQuery connection, and grant the Vertex AI User role to the connection's service account. You can skip this step if you are using the BigQuery [default connection](/bigquery/docs/default-connections#example-remote-model). | You must manually create a BigQuery connection, and grant the Vertex AI User role to the connection's service account. |
| Advantages | Allows for more flexible input and output formats. | Easier to integrate into SQL queries. |
| Extended functions | You can use the [`AI.GENERATE_TABLE` function](/bigquery/docs/reference/standard-sql/bigqueryml-syntax-generate-table) to generate output that is structured according to a SQL output schema that you specify. | You can use the [`AI.GENERATE_BOOL`](/bigquery/docs/reference/standard-sql/bigqueryml-syntax-ai-generate-bool), [`AI.GENERATE_INT`](/bigquery/docs/reference/standard-sql/bigqueryml-syntax-ai-generate-int), and [`AI.GENERATE_DOUBLE`](/bigquery/docs/reference/standard-sql/bigqueryml-syntax-ai-generate-double) functions to generate different types of scalar values. |
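
As a sketch of the extended functions, the query below asks `AI.GENERATE_TABLE` to return columns that match a declared SQL schema. The model name and the input text are placeholder assumptions; the `output_schema` option is the documented way to specify the structure:

```sql
-- AI.GENERATE_TABLE: like ML.GENERATE_TEXT, but the generated output is
-- parsed into the columns of the SQL schema you declare, instead of being
-- returned as free-form text. `mydataset.gemini_model` is a placeholder.
SELECT *
FROM AI.GENERATE_TABLE(
  MODEL `mydataset.gemini_model`,
  (SELECT 'Extract the name and age from: "Jane Doe, age 42."' AS prompt),
  STRUCT('name STRING, age INT64' AS output_schema)
);
```

The scalar variants follow the same pattern as `AI.GENERATE` but return a typed value, for example `AI.GENERATE_BOOL(...)` for a yes/no judgment, which makes them convenient in `WHERE` clauses.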
Last updated 2025-09-10 UTC.