Starting April 29, 2025, Gemini 1.5 Pro and Gemini 1.5 Flash models are not available in projects that have no prior usage of these models, including new projects. For details, see Model versions and lifecycle.
This document describes how to use the Vertex AI prompt optimizer to
automatically optimize prompt performance by improving the system instructions for a set of
prompts.
The Vertex AI prompt optimizer can help you improve your prompts
quickly at scale, without manually rewriting system instructions or individual
prompts. This is especially useful when you want to take system instructions and prompts that were written for one model and use them with a different model.
We offer two approaches for optimizing prompts:
- The zero-shot optimizer is a real-time, low-latency optimizer that improves a single prompt or system instruction template. It's fast and requires no additional setup beyond providing your original prompt or system instruction.
- The data-driven optimizer is a batch, task-level, iterative optimizer that improves prompts by evaluating the model's responses to labeled sample prompts against the evaluation metrics that you specify for your selected target model. It's for more advanced optimization and lets you configure the optimization parameters and provide a few labeled samples.
Both approaches are available through the user interface (UI) or the Vertex AI SDK.
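The following is a minimal sketch of calling the zero-shot optimizer from the Vertex AI SDK for Python. The names used here (`vertexai.Client`, `prompt_optimizer.optimize_prompt`, and the `suggested_prompt` field) are assumptions drawn from the SDK example notebooks and may differ in your SDK version; treat this as an illustration of the calling pattern, not the definitive API.

```python
# Sketch: improve a single system instruction with the zero-shot optimizer.
# NOTE: the client entry point and method/field names below are assumptions
# and may differ from the current Vertex AI SDK; check the SDK notebook for
# the exact API.
import vertexai

client = vertexai.Client(project="your-project-id", location="us-central1")

original_instruction = (
    "You are a helpful assistant. Answer the user's cooking questions."
)

# One low-latency call: no labeled samples or extra configuration required.
response = client.prompt_optimizer.optimize_prompt(prompt=original_instruction)

# The optimizer returns an improved version of the prompt or system instruction.
print(response.suggested_prompt)
```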
Supported target models for optimization
The zero-shot optimizer is model-independent and can improve prompts for any Google model.
The data-driven optimizer supports optimization only for generally available Gemini models.
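For the data-driven optimizer, you provide a small set of labeled sample prompts and name a generally available Gemini model as the target. The sketch below illustrates what those inputs might look like; the field names (`prompt`, `target`, `target_model`, `eval_metric`, `input_data_path`) and the config shape are assumptions for illustration only, not the documented schema. See the data-driven optimizer page for the exact format.

```python
# Sketch: illustrative inputs for a data-driven optimization run: a few
# labeled samples plus a config that names a generally available Gemini
# target model. Field names here are assumptions, not the documented schema.
import json

labeled_samples = [
    {"prompt": "Summarize: The meeting moved to 3 PM.",
     "target": "Meeting rescheduled to 3 PM."},
    {"prompt": "Summarize: Shipping is delayed by two days.",
     "target": "Shipping delayed two days."},
]

# Write the samples as JSON Lines so a batch optimization job can read them.
with open("samples.jsonl", "w") as f:
    for sample in labeled_samples:
        f.write(json.dumps(sample) + "\n")

# Hypothetical config: the target model must be a generally available
# Gemini model (for example, gemini-2.0-flash-001).
config = {
    "system_instruction": "You are a concise summarization assistant.",
    "target_model": "gemini-2.0-flash-001",
    "eval_metric": "summarization_quality",
    "input_data_path": "samples.jsonl",
}
print(json.dumps(config, indent=2))
```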
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-09-04 UTC."],[],[],null,["# Optimize prompts\n\nThis document describes how to use the Vertex AI prompt optimizer to\nautomatically optimize prompt performance by improving the [system\ninstructions](/vertex-ai/generative-ai/docs/learn/prompts/system-instructions) for a set of\nprompts.\n\nThe Vertex AI prompt optimizer can help you improve your prompts\nquickly at scale, without manually rewriting system instructions or individual\nprompts. This is especially useful when you want to use system instructions and\nprompts that were written for one model with a different model.\n\nWe offer two approaches for optimizing prompts:\n\n- The [**zero-shot\n optimizer**](/vertex-ai/generative-ai/docs/learn/prompts/zero-shot-optimizer) is a real-time low-latency optimizer that improves a single prompt or system instruction template. It is fast and requires no additional setup besides providing your original prompt or system instruction.\n- The [**data-driven\n optimizer**](/vertex-ai/generative-ai/docs/learn/prompts/data-driven-optimizer) is a batch task-level iterative optimizer that improves prompts by evaluating the model's response to sample labeled prompts against specified evaluation metrics for your selected target model. It's for more advanced optimization that lets you configure the optimization parameters and provide a few labeled samples.\n\nThese methods are available to users through the user interface (UI) or the\nVertex AI SDK.\n| To see an example of optimizing prompts, run one of the following Jupyter notebooks:\n|\n| - Vertex AI prompt optimizer: [Open in Colab](https://colab.research.google.com/github/GoogleCloudPlatform/generative-ai/blob/main/gemini/prompts/prompt_optimizer/vertex_ai_prompt_optimizer_ui.ipynb) \\| [Open in Colab Enterprise](https://console.cloud.google.com/vertex-ai/colab/import/https%3A%2F%2Fraw.githubusercontent.com%2FGoogleCloudPlatform%2Fgenerative-ai%2Fmain%2Fgemini%2Fprompts%2Fprompt_optimizer%2Fvertex_ai_prompt_optimizer_ui.ipynb) \\| [Open in Vertex AI Workbench user-managed notebooks](https://console.cloud.google.com/vertex-ai/workbench/deploy-notebook?download_url=https%3A%2F%2Fraw.githubusercontent.com%2FGoogleCloudPlatform%2Fgenerative-ai%2Fmain%2Fgemini%2Fprompts%2Fprompt_optimizer%2Fvertex_ai_prompt_optimizer_ui.ipynb) \\| [View on GitHub](https://github.com/GoogleCloudPlatform/generative-ai/blob/main/gemini/prompts/prompt_optimizer/vertex_ai_prompt_optimizer_ui.ipynb)\n| - Vertex AI prompt optimizer SDK: [Open in Colab](https://colab.research.google.com/github/GoogleCloudPlatform/generative-ai/blob/main/gemini/prompts/prompt_optimizer/vertex_ai_prompt_optimizer_sdk.ipynb) \\| [Open in Colab Enterprise](https://console.cloud.google.com/vertex-ai/colab/import/https%3A%2F%2Fraw.githubusercontent.com%2FGoogleCloudPlatform%2Fgenerative-ai%2Fmain%2Fgemini%2Fprompts%2Fprompt_optimizer%2Fvertex_ai_prompt_optimizer_sdk.ipynb) \\| [Open in Vertex AI Workbench user-managed 
notebooks](https://console.cloud.google.com/vertex-ai/workbench/deploy-notebook?download_url=https%3A%2F%2Fraw.githubusercontent.com%2FGoogleCloudPlatform%2Fgenerative-ai%2Fmain%2Fgemini%2Fprompts%2Fprompt_optimizer%2Fvertex_ai_prompt_optimizer_sdk.ipynb) \\| [View on GitHub](https://github.com/GoogleCloudPlatform/generative-ai/blob/main/gemini/prompts/prompt_optimizer/vertex_ai_prompt_optimizer_sdk.ipynb)\n| - Vertex AI prompt optimizer custom metrics: [Open in Colab](https://colab.research.google.com/github/GoogleCloudPlatform/generative-ai/blob/main/gemini/prompts/prompt_optimizer/vertex_ai_prompt_optimizer_sdk_custom_metric.ipynb) \\| [Open in Colab Enterprise](https://console.cloud.google.com/vertex-ai/colab/import/https%3A%2F%2Fraw.githubusercontent.com%2FGoogleCloudPlatform%2Fgenerative-ai%2Fmain%2Fgemini%2Fprompts%2Fprompt_optimizer%2Fvertex_ai_prompt_optimizer_sdk_custom_metric.ipynb) \\| [Open in Vertex AI Workbench user-managed notebooks](https://console.cloud.google.com/vertex-ai/workbench/deploy-notebook?download_url=https%3A%2F%2Fraw.githubusercontent.com%2FGoogleCloudPlatform%2Fgenerative-ai%2Fmain%2Fgemini%2Fprompts%2Fprompt_optimizer%2Fvertex_ai_prompt_optimizer_sdk_custom_metric.ipynb) \\| [View on GitHub](https://github.com/GoogleCloudPlatform/generative-ai/blob/main/gemini/prompts/prompt_optimizer/vertex_ai_prompt_optimizer_sdk_custom_metric.ipynb)\n| **Important:** While Prompt Optimizer is [generally available](/products#product-launch-stages), its SDK library is still experimental. This means that the SDK is subject to change at any time without notice. We are continuously working to improve and stabilize the SDK. You may encounter bugs or changes to APIs and functionality.\n\nSupported target models for optimization\n----------------------------------------\n\nThe zero-shot optimizer is model independent and can improve prompts for any\nGoogle model.\n\nThe data-driven optimizer supports optimization for only generally available\nGemini models.\n\nWhat's next\n-----------\n\n- Learn about [zero-shot optimizer](/vertex-ai/generative-ai/docs/learn/prompts/zero-shot-optimizer)\n\n- Learn about [data-driven optimizer](/vertex-ai/generative-ai/docs/learn/prompts/data-driven-optimizer)"]]