This page describes how to query your data using the AI-powered SQL operators provided by AlloyDB AI functions. You can use the ai.if, ai.rank, and ai.generate operators to combine natural language with SQL queries.
To use the instructions on this page, you must have an understanding of AlloyDB and be familiar with generative AI concepts.
AlloyDB AI reserves and creates the ai schema.
Before you begin
Before you use natural language in SQL operators, do the following:
-  Verify that the google_ml_integration extension is installed.
-  Verify that the google_ml_integration.enable_model_support flag is set to on (a quick check is sketched after this list).
- Integrate with Vertex AI .
- Use a Gemini model that's supported in your region .
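To confirm the flag value from a SQL session, a minimal check like the following works; this sketch is an assumption rather than an official setup step, and it relies on the standard PostgreSQL current_setting() function, which returns NULL when the parameter isn't set:

  -- 'on' means model support is enabled; NULL or 'off' means the flag
  -- still needs to be set on the instance.
  SELECT current_setting('google_ml_integration.enable_model_support', true)
      AS enable_model_support;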
Integrate with Vertex AI and install the extension
- Integrate with Vertex AI .
-  Ensure that the latest version of google_ml_integration is installed.
   -  To check the installed version, run the following command:

      SELECT extversion FROM pg_extension WHERE extname = 'google_ml_integration';

       extversion
      ------------
       1.4.3
      (1 row)

   -  If the extension isn't installed or if the installed version is earlier than 1.4.3, update the extension by running the following commands:

      CREATE EXTENSION IF NOT EXISTS google_ml_integration;
      ALTER EXTENSION google_ml_integration UPDATE;

      If you experience issues when you run the preceding commands, or if the extension isn't updated to version 1.4.3 after you run the preceding commands, contact AlloyDB support.
-  After you ensure that the version is current, install the preview functionality by running the upgrade_to_preview_version procedure:

   CALL google_ml.upgrade_to_preview_version();
   SELECT extversion FROM pg_extension WHERE extname = 'google_ml_integration';

    extversion
   ------------
    1.4.4
   (1 row)
 
Use a Gemini model that's supported in your region
If your AlloyDB for PostgreSQL cluster is in a region where gemini-2.0-flash isn't supported, you can use one of the other Gemini models available in your region by using the model_id parameter.
Alternatively, you can register a Gemini model endpoint and provide that model ID to the AI operators. For more information, see Register and call remote AI models using model endpoint management .
The following example shows how to register another Gemini endpoint. In this example, the second Gemini endpoint is the global endpoint for gemini-2.0-flash. You can use this registered model with the AI operators by passing model_id => 'gemini-2.0-flash-global' as an additional argument.
  CALL google_ml.create_model(
      model_id             => 'gemini-2.0-flash-global',
      model_type           => 'llm',
      model_provider       => 'google',
      model_qualified_name => 'gemini-2.0-flash',
      model_request_url    => 'https://aiplatform.googleapis.com/v1/projects/<project_id>/locations/global/publishers/google/models/gemini-2.0-flash:generateContent',
      model_auth_type      => 'alloydb_service_agent_iam'
  );
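Once the call succeeds, a quick way to confirm that the registered endpoint responds is a one-off call like the following sketch; the prompt text is only a placeholder, and the ai.generate function is described later on this page:

  -- Smoke test (illustrative only): ask the newly registered model for a
  -- short reply to confirm the endpoint and IAM setup work end to end.
  SELECT ai.generate(
      prompt   => 'Reply with the single word OK.',
      model_id => 'gemini-2.0-flash-global');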
 
 
Use filters in your queries
AlloyDB AI offers several AI-powered SQL functions that let you use natural language processing and LLMs directly within your database queries, including the ai.if and ai.rank operators.
Filters
To evaluate whether a condition stated in natural language is met, use the ai.if / google_ml.if operator. The function returns a boolean true or false value, and returns false if the output isn't clearly detected.
 -- Function signature
 FUNCTION ai.if(prompt TEXT, model_id VARCHAR(100) DEFAULT NULL) RETURNS bool
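As a minimal illustration of this signature, you can call the operator on its own, without referencing a table; the question below is an arbitrary example:

  -- Standalone call; returns true or false depending on the model's answer.
  SELECT ai.if(prompt => 'Is the following city in Europe? City: Paris');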
 
The following example shows the use of the ai.if operator as a filter to find restaurants with more than 500 positive reviews located in cities with a population greater than 100,000. The example uses the restaurant_reviews table, which contains data like reviews and city locations. The ai.if operator helps you understand the review sentiment and combine the locations from the database with Gemini's general knowledge about the population of these locations.
  SELECT
      r.name,
      r.location_city
  FROM
      restaurant_reviews r
  WHERE
      AI.IF(
          r.location_city
          || ' has a population of more than 100,000 AND the following is a positive review; Review: '
          || r.review)
  GROUP BY
      r.name,
      r.location_city
  HAVING
      COUNT(*) > 500;
 
 
The following shows the same example using the model that you registered in Use a Gemini model that's supported in your region .
  SELECT
      r.name,
      r.location_city
  FROM
      restaurant_reviews r
  WHERE
      AI.IF(
          r.location_city
          || ' has a population of more than 100,000 AND the following is a positive review; Review: '
          || r.review,
          model_id => 'gemini-2.0-flash-global')
  GROUP BY
      r.name,
      r.location_city
  HAVING
      COUNT(*) > 500;
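Both of the preceding queries assume a restaurant_reviews table that has at least the columns they reference. The table definition isn't part of this page; a minimal, hypothetical version of it could look like this:

  -- Hypothetical minimal schema for the examples above; a real table would
  -- likely have more columns and constraints.
  CREATE TABLE restaurant_reviews (
      id            BIGINT GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
      name          TEXT NOT NULL,   -- restaurant name
      location_city TEXT NOT NULL,   -- city where the restaurant is located
      review        TEXT NOT NULL    -- free-text user review
  );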
 
 
Perform a join on a query that uses the if operator
To perform a join operation, use the ai.if / google_ml.if operator in the join condition. The following sample query finds the number of reviews that mention each menu item from the restaurant menu.
   
  SELECT
      item_name,
      COUNT(*)
  FROM
      menu_items
  JOIN
      user_reviews
  ON
      ai.if(
          prompt => 'Does the following user review talk about the menu item mentioned? review: '
          || user_reviews.review_text
          || ' menu item: '
          || item_name)
  GROUP BY
      item_name;
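As with the earlier examples, the menu_items and user_reviews tables aren't defined on this page. A minimal, hypothetical shape that satisfies this query (and the ai.generate and ai.rank examples below, which read a review column from user_reviews) is:

  -- Hypothetical minimal schemas; adjust to your actual data model.
  CREATE TABLE menu_items (
      item_id   BIGINT GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
      item_name TEXT NOT NULL        -- name of the dish on the menu
  );

  CREATE TABLE user_reviews (
      review_id   BIGINT GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
      review_text TEXT NOT NULL,     -- full review text used in the join example
      review      TEXT               -- review column referenced by later examples
  );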
 
 
Text generation and summarization
The ai.generate function produces text by combining provided data with the user's prompt.
 -- Function signature
 FUNCTION ai.generate(prompt TEXT, model_id VARCHAR(100) DEFAULT NULL) RETURNS TEXT
 
 
For example, you can use the following query to generate a concise summary of each user review.
  SELECT
      ai.generate(
          prompt => 'Summarize the review in 20 words or less. Review: '
          || review) AS review_summary
  FROM
      user_reviews;
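If you want to keep the generated summaries instead of recomputing them on every read, one option is to store them in a column. This is a sketch rather than a pattern from this page; it assumes you can alter the table and that running the model over every row is acceptable in cost and latency:

  -- Persist the generated summary next to the review (illustrative pattern).
  ALTER TABLE user_reviews ADD COLUMN review_summary TEXT;

  UPDATE user_reviews
  SET review_summary = ai.generate(
          prompt => 'Summarize the review in 20 words or less. Review: ' || review)
  WHERE review_summary IS NULL;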
 
 
Score your query results
If you need to sort query results using custom, natural language instructions, use the ai.rank operator. This function lets you provide a prompt that describes the ranking criteria, and it returns a score for each item.
 -- Function signature
 FUNCTION ai.rank(prompt TEXT, model_id VARCHAR(100) DEFAULT NULL) RETURNS real
 
 
For example, the following query gets the top 20 most positive restaurant reviews, using scores from an LLM.
  SELECT
      review AS top20
  FROM
      user_reviews
  ORDER BY
      ai.rank(
          'Score the following review according to these rules:
          (1) Score of 8 to 10 if the review says the food is excellent.
          (2) 4 to 7 if the review says the food is ok.
          (3) 1 to 3 if the review says the food is not good. Here is the review:'
          || review) DESC
  LIMIT 20;
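If you also want to see the numeric score that drives the ordering, one option is to compute it once in a subquery and select it alongside the review. The following is a sketch with a simplified prompt, not a pattern from this page:

  -- Return the score as a column and order by it (illustrative variation).
  SELECT review, score
  FROM (
      SELECT
          review,
          ai.rank(
              'Score the following review from 1 (negative) to 10 (positive). Here is the review: '
              || review) AS score
      FROM user_reviews
  ) AS scored
  ORDER BY score DESC
  LIMIT 20;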
 
 

