Docker script

Writing a script in Docker to set up your environment

You can query a model directly and test the results returned when using different parameter values with the Cloud console, or by calling the Vertex AI API directly.

Prompt (Freeform)

Generate a Docker script to create a simple Linux machine that has Python 3.10 (prebuilt) installed with following libraries: pandas, tensorflow, numpy

Response

Of course! Here is a complete solution using a Dockerfile. This approach is considered best practice as it is clean, reproducible, and easy to maintain.

We will create two files:

  1. Dockerfile: The script that tells Docker how to build the image.
  2. requirements.txt: A file listing the Python libraries to install.

This separation makes it easy to update your Python libraries without changing the core Dockerfile logic.


Step 1: Create a requirements.txt file

Create a file named requirements.txt and add the names of the libraries you want to install.

requirements.txt

    pandas
    tensorflow
    numpy

Note: While pandas and tensorflow both depend on numpy, listing it explicitly is fine and ensures it's included.
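
If you need the build to stay reproducible over time, you can also pin exact versions in requirements.txt. The version numbers below are only an illustration of the syntax, not a recommendation; check which releases you actually want (and that they support Python 3.10) before pinning:

    pandas==2.0.3
    tensorflow==2.13.0
    numpy==1.24.3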


Step 2: Create the Dockerfile

Create a file named Dockerfile (with no extension) in the same directory. This script defines your Linux environment.

Dockerfile

    # Step 1: Use an official Python 3.10 image as the base.
    # 'slim-bullseye' is a good choice as it's smaller than the default
    # but still has the necessary tools.
    FROM python:3.10-slim-bullseye

    # Step 2: Set the working directory inside the container.
    # This is where your commands will run and files will be placed.
    WORKDIR /app

    # Step 3: Copy the requirements file into the container.
    # This is done before installing to leverage Docker's layer caching.
    # If requirements.txt doesn't change, Docker won't re-run the installation.
    COPY requirements.txt .

    # Step 4: Install the Python libraries.
    # --no-cache-dir: Reduces the final image size by not storing the pip cache.
    # --upgrade pip: Ensures we have the latest version of pip.
    RUN pip install --no-cache-dir --upgrade pip && \
        pip install --no-cache-dir -r requirements.txt

    # Step 5: Define the default command to run when the container starts.
    # 'bash' will drop you into an interactive shell inside the Linux machine,
    # allowing you to run python, check files, etc.
    CMD ["bash"]

How to Use It

  1. Save the files: Make sure both Dockerfile and requirements.txt are in the same empty directory.

  2. Build the Docker image: Open your terminal, navigate to that directory, and run the following command. We'll name (-t) our image python-ml-env.

     docker build -t python-ml-env .

    (The . at the end is important; it sets the build context to the current directory, which is also where Docker looks for the Dockerfile by default.)

  3. Run the container: Once the build is complete, you can start a container from the image. The -it flags make the session interactive.

     docker run -it --rm python-ml-env

    • -it gives you an interactive terminal.
    • --rm automatically removes the container when you exit, keeping your system clean.

You will be dropped into a bash prompt inside your new Linux environment, like this: root@<container_id>:/app#

  4. Verify the installation: Inside the container's shell, you can start Python and import the libraries to confirm they are installed correctly.

     # Start the Python interpreter
     python3

     # Inside the Python interpreter, type:
     >>> import pandas as pd
     >>> import tensorflow as tf
     >>> import numpy as np
     >>> print(f"TensorFlow version: {tf.__version__}")
     >>> print(f"Pandas version: {pd.__version__}")
     >>> print(f"NumPy version: {np.__version__}")
     >>> exit()

To exit the container's bash shell, simply type exit.
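
If you later want to run your own code against these libraries without opening an interactive shell, one common pattern is to mount your working directory into /app (the WORKDIR set in the Dockerfile) and call Python directly. The script name train.py below is just a placeholder for whatever file you have locally:

    docker run --rm -v "$(pwd)":/app python-ml-env python3 train.py

Because the directory is mounted rather than copied into the image, edits you make on the host are immediately visible inside the container.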

Alternative (Simpler, Less Maintainable) Dockerfile

If you prefer to have everything in a single file for a very simple, one-off case, you can write the pip install command directly in the Dockerfile without a requirements.txt file.

Dockerfile (single-file version)

    # Use an official Python 3.10 image
    FROM python:3.10-slim-bullseye

    # Install the specified libraries directly
    RUN pip install pandas tensorflow numpy

    # Drop into a bash shell on start
    CMD ["bash"]

This version works, but it is less maintainable: the library list is baked into the Dockerfile itself, and any change to that RUN line invalidates its cached layer. With a separate requirements.txt, the Dockerfile stays stable and Docker only re-runs the installation when requirements.txt itself changes, which matters more as the image grows. The first method using requirements.txt is highly recommended for any serious project.
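
To make the caching argument concrete, here is a sketch of how the requirements.txt layout pays off once the image also contains your own code. The COPY . . step and the idea of adding application code are assumptions about how the image might evolve, not part of the original request:

    # Same Python 3.10 base image as above
    FROM python:3.10-slim-bullseye
    WORKDIR /app

    # Dependency layer: rebuilt only when requirements.txt changes.
    COPY requirements.txt .
    RUN pip install --no-cache-dir -r requirements.txt

    # Application layer: editing your code invalidates only this step,
    # so the (slow) pip install above is served from the cache.
    COPY . .

    CMD ["bash"]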

Model: gemini-2.5-pro
Temperature: 0.2
Max output tokens: 8192
TopK: 40.0
TopP: 0.95