By Aimpoint Digital, A Gurobi Alliance Premier Service Partner

This article originally appeared on aimpointdigital.com.

In part one of our two-part series on utilizing Gurobi in Snowflake, we demonstrated how Gurobi models can be executed within Snowflake via Snowpark Container Services (SPCS). This unlocks the potential of advanced decision science workflows for a wide range of applications.

Streamlining the Process with Snowsight

In this blog, we will discuss how the process can be further streamlined using Snowsight. Snowsight is the Snowflake web interface that provides unified access for executing queries in SQL or Python. With the help of the PuLP package, which is available via the Snowflake Anaconda Distribution, we can construct the optimization model, transmit it to a REST API hosted in SPCS for execution, and retrieve the model results, all coordinated within the Snowflake Snowsight environment.

What Are the Benefits of Using Snowsight for Solving Optimization Models?

Snowsight streamlines optimization model execution by offering improved usability, faster execution, and enhanced auditability. Your optimization tasks can seamlessly integrate into Snowflake’s environment, leveraging SPCS services for efficient processing, all while maintaining comprehensive tracking and auditing capabilities within Snowsight.  

  • Improved Usability: Running Gurobi models using the method outlined in our previous blog requires the user to interact with a Jupyter service executing Python code. The method described here instead allows Gurobi models to be executed via the same unified interface that analysts routinely use for all data querying, making access easier and thereby increasing adoption.  
  • Faster Execution: By carrying out all the data processing and modeling within the Snowsight worksheet, we leverage Snowflake’s distributed architecture and parallel processing capabilities and ensure that only the necessary data I/O happens with the API service in SPCS. Minimizing data movement and the associated latency speeds up execution.  
  • Improved Auditability: Executing Gurobi models via Snowsight is the same as executing any other SQL or Python query within Snowflake. This ensures that all experimentation with different models is tracked in the query history, providing an audit log of modeling decisions. 

How Does Running Gurobi Models in Snowsight Work?

Running models in Snowsight involves transmitting them to SPCS. This is essential because, as of March 2024, Gurobi models can only be executed in the Snowflake ecosystem using SPCS.  

Figure 1. Architecture diagram of calls made from and to Snowsight and SPCS

1. Build a Docker image (sketched below) that: 

  • Installs required packages – including Gurobi
  • Mounts the working directory with the Python script containing the Flask API
  • Exposes the appropriate port
  • Initiates the REST API 
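
A minimal Dockerfile along these lines could look as follows; the base image, file names, and port are illustrative assumptions rather than the exact files used here:

```dockerfile
# Illustrative sketch -- base image, file names, and port are assumptions.
FROM python:3.10-slim
WORKDIR /app

# Install required packages, including gurobipy and pulp
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy in the Python script containing the Flask API
COPY gurobi_api.py .

# Expose the port the REST API listens on
EXPOSE 8080

# Start the REST API
CMD ["python", "gurobi_api.py"]
```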

2. Define the Snowpark Container Service through an appropriate YAML file. The file includes the required configurations, settings, and parameters needed to execute Gurobi models in your Snowflake account. 

  • Push the Docker image to your Snowflake image repository and upload the YAML file to a stage in your account. 

3. Create the SPCS Service (an example statement is sketched after this list). You will need:

  • A designated service name
  • A compute pool for service execution
  • A specification file outlining the service’s configuration (the YAML file defined in Step 2)
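
The service-creation statement could look roughly like the sketch below; the object names are placeholders, and the exact syntax should be confirmed against the quickstart guide for your Snowflake version:

```sql
-- Illustrative sketch: service, compute pool, stage, and spec file names are placeholders.
CREATE SERVICE gurobi_solver_service
  IN COMPUTE POOL gurobi_compute_pool
  FROM @service_spec_stage
  SPEC = 'gurobi_service.yaml'
  MIN_INSTANCES = 1
  MAX_INSTANCES = 1;
```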

4. Define your Gurobi model in a Python worksheet in Snowsight using PuLP. PuLP is available as one of the default packages listed in the Snowflake Anaconda Distribution. 

5. Run the model by invoking a Snowflake User-Defined Function (UDF) that sends the PuLP model as a dictionary to the REST API.

6. The REST API will initialize the compute pool and solve the optimization model. 

7. Continue your work in the Python worksheet as usual with the JSON results returned by the UDF.  

Creating a Gurobi Solver REST API Service in SPCS 

Follow Snowflake’s quickstart guide to complete the initial SPCS setup. Parts 2, 3, and 5 of the guide provide a detailed overview of the SPCS infrastructure and step-by-step instructions on how to get started. The following steps provide tips to help you set up the Gurobi solver API within SPCS and execute PuLP models seamlessly in Snowsight. 

Step 1: Setting Up the Docker Image 

  • Your requirements.txt file in your Docker container must include both pulp and gurobipy. The pulp module is necessary for reading the model dictionary from the Snowsight Python worksheet, while gurobipy is essential for solving the model. 
  • The Python REST API script should instantiate a PuLP model based on the incoming model dictionary (model_dict) with ‘GUROBI’ attached as the solver, as sketched below.
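
A minimal sketch of that step, assuming the incoming payload is the dictionary produced by PuLP’s to_dict() on the worksheet side:

```python
import pulp

# model_dict is the dictionary received from the Snowsight worksheet
# (produced by LpProblem.to_dict() on the client side).
variables, problem = pulp.LpProblem.from_dict(model_dict)

# Solve with Gurobi via gurobipy
problem.solve(pulp.GUROBI(msg=True))
```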

The REST API itself can be implemented with Flask.
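
The sketch below shows one way such a service could be structured; it assumes a single /optimize POST endpoint, the {"data": [[row_index, value], ...]} payload format used by Snowflake service functions, and port 8080:

```python
import os

import pulp
from flask import Flask, jsonify, request

app = Flask(__name__)


@app.route("/optimize", methods=["POST"])
def optimize():
    # Snowflake sends {"data": [[row_index, model_dict], ...]}
    rows = request.json["data"]

    output_rows = []
    for row_index, model_dict in rows:
        # Rebuild the PuLP problem from the dictionary sent by the worksheet
        variables, problem = pulp.LpProblem.from_dict(model_dict)

        # Gurobi license parameters defined in the YAML secrets section are
        # available as environment variables, e.g. os.environ.get("GUROBI_WLSACCESSID")
        problem.solve(pulp.GUROBI(msg=True))

        result = {
            "status": pulp.LpStatus[problem.status],
            "objective": pulp.value(problem.objective),
            "variables": {name: var.value() for name, var in variables.items()},
        }
        output_rows.append([row_index, result])

    # Snowflake expects {"data": [[row_index, result], ...]} in return
    return jsonify({"data": output_rows})


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=int(os.environ.get("SERVER_PORT", 8080)))
```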

A few things to note from the two code snippets above: 

  • Although we created the REST API using Flask, you can use any other REST API framework of your choice. 
  • The Gurobi license parameters you defined in the YAML file can be retrieved by calling os.environ. They are stored exactly as you defined them in the envVarName field within the YAML secrets section shown below. 

Step 2: Defining the YAML File for Snowpark Container Services 

  • If you have a Gurobi license that you would like to include, you will need to adjust the YAML file and include the secrets field within containers, as sketched below.
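
A sketch of what that could look like, using three Gurobi WLS-style parameters; the secret object names, environment-variable names, and image path are illustrative:

```yaml
spec:
  containers:
  - name: gurobi-solver
    image: /my_db/my_schema/my_image_repository/gurobi_solver:latest
    secrets:
    - snowflakeSecret: gurobi_wls_access_id   # secret object created in Snowflake
      secretKeyRef: secret_string
      envVarName: GUROBI_WLSACCESSID          # read in the API via os.environ
    - snowflakeSecret: gurobi_wls_secret
      secretKeyRef: secret_string
      envVarName: GUROBI_WLSSECRET
    - snowflakeSecret: gurobi_license_id
      secretKeyRef: secret_string
      envVarName: GUROBI_LICENSEID
  endpoints:
  - name: api
    port: 8080
```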

A note on the Gurobi license: For comprehensive details regarding licensing terms and permissions, we recommend visiting the Gurobi website or reaching out to Gurobi directly.  

Once you have configured the YAML file, make sure to create the corresponding secret objects in a Snowflake SQL worksheet. Each secret object should correspond to one of the snowflakeSecret locations listed in the YAML file.
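
For example, GENERIC_STRING secrets could be created along these lines; the names must match the snowflakeSecret values in the YAML file, and the placeholder values are illustrative:

```sql
-- One secret per snowflakeSecret entry referenced in the YAML file
CREATE SECRET gurobi_wls_access_id
  TYPE = GENERIC_STRING
  SECRET_STRING = '<your-wls-access-id>';

CREATE SECRET gurobi_wls_secret
  TYPE = GENERIC_STRING
  SECRET_STRING = '<your-wls-secret>';

CREATE SECRET gurobi_license_id
  TYPE = GENERIC_STRING
  SECRET_STRING = '<your-license-id>';
```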

Step 3: Running the API from a Snowflake UDF 

After deploying your API service to a compute pool, you will need to encapsulate it within a Snowflake UDF for use in a Python worksheet.
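
A sketch of such a function, assuming the service and endpoint names used above (gurobi_solver_service and api) and VARIANT-style input and output; adjust the data types to whatever your API expects:

```sql
-- Illustrative sketch: wrap the SPCS /optimize endpoint in a service function
CREATE OR REPLACE FUNCTION solve_gurobi_model(model VARIANT)
  RETURNS VARIANT
  SERVICE = gurobi_solver_service
  ENDPOINT = api
  AS '/optimize';
```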

Make sure that the UDF’s input and output data types match what you set up in the Python REST API script. Similarly, ensure that the API endpoint listed in the UDF is consistent with how you defined it in @app.route, which in this case is /optimize.

To use the UDF above, you can simply call it by name from a Snowsight Python worksheet.
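
For example, assuming the service function is named solve_gurobi_model and df is a Snowpark dataframe with the model dictionary in a MODEL column:

```python
from snowflake.snowpark.functions import call_udf, col

# Invoke the service function on each row's model dictionary
results_df = df.with_column("RESULTS", call_udf("solve_gurobi_model", col("MODEL")))
results_df.show()
```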

df['MODEL'] is a column containing the PuLP model dictionary. For more details on how this fully integrates with the optimization model code, see the run_model() method in the section below.

Building and Running Optimization Models from Snowsight Python Worksheet 

After deploying your REST API service to a compute pool and creating a corresponding UDF, you can call this UDF from a Python worksheet in Snowsight to solve your PuLP model. Before building the PuLP model, make sure that you have added the PuLP package from the ‘Anaconda Packages’ tab, as shown in Figure 2. 

Figure 2. The PuLP package is listed as one of the installed packages

Once PuLP is confirmed as one of the installed packages, proceed to create your PuLP model. A sample PuLP model is defined in the Python worksheet below.  
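
The sketch below illustrates one way such a worksheet could be structured; the toy model, the UDF name solve_gurobi_model, and the tracking table OPTIMIZATION_MODEL_RUNS are assumptions to adapt to your own setup:

```python
import json

import pulp
from snowflake.snowpark import Session
from snowflake.snowpark.functions import call_udf, col, current_timestamp, parse_json


def build_model() -> dict:
    """Build a small illustrative PuLP model and return it as a dictionary."""
    problem = pulp.LpProblem("production_plan", pulp.LpMaximize)

    x = pulp.LpVariable("product_a", lowBound=0)
    y = pulp.LpVariable("product_b", lowBound=0)

    problem += 3 * x + 2 * y            # objective: maximize profit
    problem += 2 * x + y <= 100         # machine-hours capacity
    problem += x + 3 * y <= 90          # labor-hours capacity

    return problem.to_dict()


def run_model(session: Session, model_dict: dict):
    """Send the model dictionary to the SPCS REST API via the UDF and return the results."""
    df = session.create_dataframe([[json.dumps(model_dict)]], schema=["MODEL_JSON"])
    df = df.with_column("MODEL", parse_json(col("MODEL_JSON")))

    # call_udf invokes the service function that wraps the /optimize endpoint
    results_df = (
        df.with_column("RESULTS", call_udf("solve_gurobi_model", col("MODEL")))
          .with_column("RUN_TIMESTAMP", current_timestamp())
          .select("RUN_TIMESTAMP", "MODEL", "RESULTS")
    )
    return results_df


def main(session: Session):
    # Entry point of the Snowsight Python worksheet
    model_dict = build_model()
    results_df = run_model(session, model_dict)

    # Append this run to a tracking table to keep a history of model runs over time
    results_df.write.mode("append").save_as_table("OPTIMIZATION_MODEL_RUNS")
    return results_df
```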

The build_model() method generates a Python dictionary representing the PuLP model. You can insert this model dictionary into a Snowpark session dataframe and use the call_udf Snowpark function on the column containing the model dictionary. This function invokes the UDF you previously created for the REST API service. 

The run_model() method returns a dataframe that can be appended to an existing table, allowing you to maintain a record of your model runs over time (see Figure 3). This integration not only streamlines the monitoring, tracking, and logging of your optimization models but also ensures a cohesive approach to their lifecycle management.  

Figure 3. A Snowflake database that keeps track of your past model runs
