Submitting a Distributed Algorithm as a Batch
With a Cluster Manager, you can also submit a distributed MIP or distributed concurrent MIP solve as a batch using the `batch solve` command. Distributed tuning is not yet supported. Here is an example:
```
> grbcluster batch solve DistributedMIPJobs=2 misc07.mps
info : Batch f1026bf5-d5cf-44c9-81f8-0f73764f674a created
info : Uploading misc07.mps...
info : Batch f1026bf5-d5cf-44c9-81f8-0f73764f674a submitted with job d71f3ceb...
```
As we can see, the model was uploaded and the batch was submitted. This creates a parent job that acts as a proxy for the client. Because we set `DistributedMIPJobs=2`, the parent job in turn starts two worker jobs. This can be observed in the job history:
```
> grbcluster job history --length=3
JOBID    BATCHID  ADDRESS       STATUS    STIME               USER  OPT     API        PARENT
d71f3ceb f1026bf5 server1:61000 COMPLETED 2019-09-23 14:17:57 jones OPTIMAL grbcluster
6212ed73          server1:61000 COMPLETED 2019-09-23 14:17:57 jones OPTIMAL            d71f3ceb
63cfa00d          server2:61000 COMPLETED 2019-09-23 14:17:57 jones OPTIMAL            d71f3ceb
```
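The same submission can also be made programmatically through gurobipy's batch mode, which is equivalent to the `grbcluster batch solve` command above. The sketch below assumes a reachable Cluster Manager; the manager address and API credentials are placeholders, not values taken from this example:

```python
import gurobipy as gp

# Minimal sketch: submit a distributed MIP solve as a batch via a
# Cluster Manager. CSManager, CSAPIAccessID, and CSAPISecret below
# are placeholders for your own deployment.
with gp.Env(empty=True) as env:
    env.setParam("CSManager", "http://localhost:61080")  # placeholder address
    env.setParam("CSAPIAccessID", "your-access-id")      # placeholder credential
    env.setParam("CSAPISecret", "your-secret")           # placeholder credential
    env.setParam("CSBatchMode", 1)  # submit as a batch instead of solving interactively
    env.start()

    with gp.read("misc07.mps", env=env) as model:
        # Request two distributed worker jobs, as in the CLI example
        model.setParam("DistributedMIPJobs", 2)
        # In batch mode, optimizeBatch() uploads the model and returns
        # the batch ID without waiting for the solve to finish
        batch_id = model.optimizeBatch()
        print("Submitted batch", batch_id)
```

The returned batch ID can then be monitored or retried with the same `grbcluster batch` commands shown above.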