There are two ways to specify dependencies for Cloud Run functions written in Python: using the package manager's requirements.txt file or packaging local dependencies alongside your function.
Dependency specification using the Pipfile/Pipfile.lock standard is not supported. Your project shouldn't include these files.
The Functions Framework is a required dependency for all functions. Although Cloud Run installs it on your behalf when the function is created, we recommend that you include it as an explicit dependency.
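For example, a minimal HTTP function imports the framework directly, which is why functions-framework belongs in your requirements.txt. A minimal sketch (the handler name and message are only illustrative):
# main.py - minimal sketch; "hello" is an illustrative function name
import functions_framework

@functions_framework.http
def hello(request):
    # "request" is the incoming Flask request object provided by the framework.
    return "Hello from Cloud Run functions!"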
Specify dependencies
Specify your application dependencies in a requirements.txt file in the root directory. This file must be in the same directory as the main.py file that contains your source code. The requirements.txt file contains one line per package. Each line contains the package name, and optionally, the requested version. To prevent your build from being affected by dependency version changes, consider pinning your dependency packages to a specific version.
The following is an example requirements.txt file:
functions-framework
requests==2.20.0
numpy
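Packages declared this way can then be imported in your function code. A minimal sketch, assuming the example requirements.txt above (the handler name is illustrative):
# main.py
import functions_framework
import numpy
import requests

@functions_framework.http
def versions(request):
    # Report the versions of the installed dependencies.
    return f"requests {requests.__version__}, numpy {numpy.__version__}"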
Python 3.14 and later
Starting with Python version 3.14 (preview), the Python buildpack uses the UV package manager as the default installer for the dependencies you specify in your requirements.txt file.
To use pip as the package manager, configure the environment variable GOOGLE_PYTHON_PACKAGE_MANAGER="pip".
Run the gcloud run deploy command to set the package manager environment variable to pip:
gcloud run deploy SERVICE \
--source . \
--set-build-env-vars=GOOGLE_PYTHON_PACKAGE_MANAGER=pip
Replace SERVICE with the name of your Cloud Run service.
Python 3.13 and earlier
For Python version 3.13 and earlier, the Python buildpack uses the pip package manager to install dependencies you define in the requirements.txt file.
To use UV (preview) as the package manager, configure the environment variable GOOGLE_PYTHON_PACKAGE_MANAGER="uv".
Run the gcloud beta run deploy command to set the package manager environment variable to uv:
gcloud beta run deploy SERVICE \
--source . \
--set-build-env-vars=GOOGLE_PYTHON_PACKAGE_MANAGER=uv
Replace SERVICE with the name of your Cloud Run service.
Package local dependencies
You can also package and deploy dependencies alongside your function. This approach is useful if your dependency isn't available through the uv or pip package manager, or if your Cloud Run environment's internet access is restricted.
For example, you might use a directory structure such as the following:
myfunction/
├── main.py
└── localpackage/
    ├── __init__.py
    └── script.py
You can then import the code as usual from localpackage using the following import statement:
# Code in main.py
from localpackage import script
Note that this approach will not run any setup.py files. Packages with those files can still be bundled, but may not run correctly on Cloud Run functions.
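To make the layout concrete, here is a minimal sketch of the local package and its use; the function names are illustrative:
# localpackage/script.py
def add(a, b):
    # Example helper bundled alongside the function.
    return a + b

# main.py
import functions_framework
from localpackage import script

@functions_framework.http
def compute(request):
    return str(script.add(1, 2))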
Copied dependencies
Copied dependencies are dependencies whose source is included directly in your source code package and rebuilt alongside your own code.
Use the GOOGLE_VENDOR_PIP_DEPENDENCIES build environment variable to create copied pip dependencies and avoid installing them during deployment.
Create copied dependencies
- Ensure that python3 is installed on your development system.
- Declare your application dependencies in a requirements.txt file in the root directory of your development tree.
- Declare Functions Framework as a requirement by including functions-framework on a separate line in your requirements.txt file.
- Download your function's dependencies to your local directory. The steps to do this depend on whether the dependency is a Python wheel (*.whl) file or a tar file (*.tar.gz).
  - If the dependency is a Python wheel (*.whl), download it into the root directory of your development tree with this pip command:
    python3 -m pip download -r requirements.txt --only-binary=:all: \
        -d DIRECTORY \
        --python-version PYTHON_RUNTIME_VERSION \
        --platform manylinux2014_x86_64 \
        --implementation cp
    Replace:
    - DIRECTORY: the name of the local directory to download to.
    - PYTHON_RUNTIME_VERSION: the Python version to use for compatibility checks. For example, 311 for Python 3.11. This version must match one of the supported Python runtimes.
    The resulting directory structure should look like this:
    myfunction/
    ├── main.py
    ├── requirements.txt
    └── DIRECTORY
        ├── dependency1.whl
        └── dependency2.whl
  - If the dependency is a tar file (*.tar.gz):
    - If the dependency is written in Python, use pip to download it:
      python3 -m pip download -r requirements.txt \
          -d DIRECTORY
    - If a dependency consists of code written in C or C++, you must download and compile it separately.
- Deploy your function and its copied dependencies:
  gcloud functions deploy FUNCTION_NAME \
      --runtime PYTHON_RUNTIME_NAME \
      --set-build-env-vars GOOGLE_VENDOR_PIP_DEPENDENCIES=DIRECTORY
  Replace:
  - FUNCTION_NAME: the name of the function you're deploying.
  - PYTHON_RUNTIME_NAME: the name of one of the supported Python runtimes to run your deployed function under, for example python311. This must be the same Python runtime version as you've used in your local development environment.
  - DIRECTORY: the name of the directory containing your copied dependencies.
For more details about using buildpacks, see Build a function with buildpacks.
Use private dependencies
You can use private dependencies from Artifact Registry or from other repositories.
Private dependencies from Artifact Registry
An Artifact Registry Python repository can host private dependencies for your Python function. When deploying to Cloud Run, the build process automatically generates Artifact Registry credentials for the Cloud Build service account. You only need to include the Artifact Registry URL in your requirements.txt without generating additional credentials. For example:
--index-url REPOSITORY_URL
sampleapp
Flask==0.10.1
google-cloud-storage
If your build needs multiple repositories, use an Artifact Registry virtual repository to safely control the order that pip searches your repositories.
Private dependencies from other repositories
Dependencies are installed in a Cloud Build environment that does not provide access to SSH keys. Packages hosted in repositories that require SSH-based authentication must be copied and uploaded alongside your project's code, as described in the previous section.
You can use the pip install command with the -t DIRECTORY flag to copy private dependencies into a local directory before deploying your app, as follows:
- Copy your dependency into a local directory:
  pip install -t DIRECTORY DEPENDENCY
- Add an empty __init__.py file to the DIRECTORY directory to turn it into a module.
- Import from this module to use your dependency:
  import DIRECTORY.DEPENDENCY
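For example, assuming you copied a hypothetical package privlib into a local directory named vendored, your function code could import and call it like this:
# main.py - "vendored" and "privlib" are placeholder names for this sketch
import functions_framework
from vendored import privlib

@functions_framework.http
def handler(request):
    # do_something() is a placeholder for whatever your private package provides.
    return privlib.do_something()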
Pre-installed packages
The following Python packages are automatically installed alongside your function during deployment. If you are using any of these packages in your function code, we recommend that you include the following versions in your requirements.txt file:
Python 3.8 and later
- pip (latest version)
- setuptools (latest version)
- wheel (determined by product requirements)
In addition, the Python runtime includes a number of system packages in the execution environment.
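If you want to confirm which versions your deployed function actually runs with, one option is to log them from your own code, for example with importlib.metadata:
# A small sketch that logs the versions of the pre-installed packaging tools.
from importlib.metadata import PackageNotFoundError, version

for pkg in ("pip", "setuptools", "wheel"):
    try:
        print(f"{pkg} {version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg} is not installed")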

