
The DW bound and Best Dual did not converge #253

Open
phumthep opened this issue Jan 18, 2023 · 10 comments

phumthep commented Jan 18, 2023

I am working on a unit commitment/energy dispatch model called PowNet. The model has 26.8k variables and 100k constraints. I want to exploit the structure of the problem by using the Dantzig-Wolfe decomposition to decrease the solve time.

I got the decomposition file using the GCG solver (with presolve disabled), which resulted in 1292 subproblems and a master problem with 2832 constraints. It appears that the DW bound and the Best Dual are not converging. The DSP solver terminated with the following information:

Iteration  68: DW Bound +1.797693e+308, Best Dual +1.045458e+07 (gap Large %), nrows 11717, ncols 4124, timing (total 35516.60, master 4234.84, gencols 31275.72), statue 3000
Unexpected solution status: 3013 in DwBundleDual::solveMaster
Error code 1 in /home/phumthep/DSP/src/Solver/DantzigWolfe/DwMaster.cpp:549
Unexpected solution status: 3013.
Alps0240I Proc: 0, Part: 0, Cand: 0, Best N: 1e+75, Best S: 1e+75
Status: 3000
Primal Bound: 1e+75
Dual Bound  : 1e+75

Note: I ran the DSP using Gurobi as the underlying solver. Gurobi could solve this problem in five minutes and the objective value was 2.419e+07.

@phumthep phumthep changed the title The DW bound does not converge The DW bound and Best Dual did not converge Jan 18, 2023
kibaekkim (Collaborator)

First, I have a few observations:

  • It looks like you are solving the subproblems in serial (not in parallel). If you are not sure what I am talking about, you are most likely not running in parallel. Solving the subproblems in parallel could reduce the gencols time.
  • Even if you are running in parallel, I see that the master also took a lot of time. This is something that needs improvement on the DSP algorithm side, from a research perspective. There are some parameters we can try to tune. Did you change any parameters from their default values?
  • Unfortunately, the error code does not give us much information about what happened. Can you set the parameter LOG_LEVEL to 3 or 5? You can also try setting DW/MASTER/SOLVER/LOG_LEVEL to a nonzero value.

It would be great if you could share with us:

  • your command line with arguments
  • parameters, if you change any
  • entire log information from the beginning
  • instance files, if possible
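As background for the master/gencols timings discussed above, the restricted-master-plus-pricing loop at the heart of Dantzig-Wolfe can be sketched in a few lines. This is a generic, hypothetical illustration on a toy cutting-stock instance using scipy, not DSP code; the instance data and the brute-force pricing step are made up for illustration:

```python
import itertools

import numpy as np
from scipy.optimize import linprog

ROLL = 10                        # raw roll length
sizes = np.array([3, 4, 5])      # piece lengths
demand = np.array([30, 20, 10])  # pieces required

# Start with trivial patterns: each roll cut into one piece type only.
patterns = [np.eye(len(sizes), dtype=int)[i] * (ROLL // s)
            for i, s in enumerate(sizes)]

for _ in range(20):
    A = np.column_stack(patterns)
    # Restricted master: min sum(x) s.t. A @ x >= demand, x >= 0.
    res = linprog(c=np.ones(A.shape[1]), A_ub=-A, b_ub=-demand,
                  bounds=(0, None), method="highs")
    duals = -res.ineqlin.marginals      # nonnegative prices on demand rows
    # Pricing ("gencols"): brute-force the knapsack for the best new pattern.
    best_val, best_pat = 0.0, None
    for combo in itertools.product(*(range(ROLL // s + 1) for s in sizes)):
        pat = np.array(combo)
        if pat @ sizes <= ROLL and pat @ duals > best_val:
            best_val, best_pat = pat @ duals, pat
    if best_val <= 1.0 + 1e-9:          # reduced cost 1 - best_val >= 0
        break                           # no improving column: master optimal
    patterns.append(best_pat)

print(f"LP bound {res.fun:.3f} rolls using {len(patterns)} patterns")
```

Each pass mirrors the two timing columns in the DSP log: solving the master ("master" time), then pricing out new columns ("gencols" time), stopping when no column has negative reduced cost.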

phumthep (Author)

Thanks for the suggestions. I did a quick experiment with both serial and parallel runs. For the parallel run, I used 4 MPI processes. I did not tune any DSP parameters, so that is something I might try next.

For the parallel run, my command line was:
nohup mpiexec -n 4 runDsp --algo dw --mps th_model.mps --dec th_model_np1.dec > np1_n4_output.log 2> np1_n4_error.log &

Please see the files below.
(1) log files - The log shows that the solver encountered many infeasible solutions. Is this normal?
logfiles.zip

(2) instance files - One mps file and one dec file. I got the decomposition from GCG with presolve disabled. The 1292 subproblems can be classified into eight types, and they make sense to me.
model_instance.zip
[image attachment]

phumthep (Author) commented Feb 2, 2023

I could use DSP to solve a decomposition that has no constraints in the master problem. This decomposition has 155 blocks, and almost all of the constraints are pushed into Block 1; Blocks 2 through 155 have only one constraint each. Since all the constraints are concentrated in Block 1, I don't think this is a good decomposition.

Other decompositions will not converge because the master problem appears to be infeasible, which makes its dual unbounded. Would you be able to advise me on what I could do, @kibaekkim?
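The symptom described here is consistent with basic LP duality: when the primal (master) LP is infeasible, its dual is unbounded or infeasible, so the dual iterates have nothing to converge to. A tiny self-contained illustration with scipy, on a made-up two-variable LP unrelated to the actual instance:

```python
from scipy.optimize import linprog

# Primal LP: min x1 + x2  s.t.  x1 + x2 <= 1  and  x1 + x2 >= 3  (contradictory)
primal = linprog(c=[1, 1], A_ub=[[1, 1], [-1, -1]], b_ub=[1, -3],
                 bounds=(0, None), method="highs")

# Its LP dual, written out by hand (y1, y2 >= 0 price the two rows):
#   max 3*y2 - y1  s.t.  y2 - y1 <= 1  -> minimize the negated objective
dual = linprog(c=[1, -3], A_ub=[[-1, 1], [-1, 1]], b_ub=[1, 1],
               bounds=(0, None), method="highs")

print("primal status:", primal.status)  # 2 = infeasible
print("dual status:  ", dual.status)    # 3 = unbounded
```

Since the dual is feasible at y = (0, 0) but its objective can be pushed to infinity along y = (t, 1 + t), the dual prices a column-generation master would return are meaningless here.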

hideakiv (Collaborator) commented Feb 2, 2023

Hi @phumthep, I have observed the issue with the master problem being infeasible. Would you be able to create a small instance of this problem so that we could understand the issue better?

phumthep (Author) commented Feb 10, 2023

Hi @hideakiv. Sorry for the wait. I have been running some preliminary experiments with a dummy problem (smaller instance). I think there might be something weird going on with DSP.

I have created a dummy problem, shown in the figure below. The power system contains four generators (3 thermal power plants and 1 hydropower plant). The two plant types behave the same, except that the hydropower plant has no ramping constraints. There are three substations: Node1 and Node2 have demand, while Node3 has no demand and acts as a relay point. The dummy model contains 700 variables and 1986 constraints.

[image attachment]

I have decomposed the problem into different structures using GCG (with presolve disabled). I was able to solve the decompositions just fine with GCG. However, with DSP I get "Infeasible model. The primal master could not be solved to optimality". Some other decompositions produce a "No primal ray" error. I have provided the instance with two decompositions, representing both types of errors, in the zip file below.

dummy_model.zip

A summary of one decomposition structure ("dummy_o1107_infeasible.dec") is shown in the figure below. The figure shows the blocks that become the subproblems in the DW decomposition. There are 56 blocks of 7 types.

[image attachment]

I have also collected some stats which might be useful.

[image attachment]

Please let me know if you need more details!

@kibaekkim
Copy link
Collaborator

@phumthep: Thanks for the detailed information. Just to make sure that we are on the same page, which repository (or DSP version) are you using?

phumthep (Author)

@kibaekkim I checked my DSP and the version is Version 1.5.2. This should be the latest one right?

kibaekkim (Collaborator)

> @kibaekkim I checked my DSP and the version is Version 1.5.2. This should be the latest one right?

Thanks for the quick response. We will get back to you after running your instance.

kibaekkim (Collaborator)

@phumthep I can confirm that something must be going on: I get something different from what you observed.

First of all, I was able to load your problem instances and solve them using the "deterministic equivalent formulation" with CPLEX. This can be done with something like:

./bin/runDsp --algo de --mps ./issue253/dummy_model.mps --dec ./issue253/dummy_o1107_infeasible.dec
./bin/runDsp --algo de --mps ./issue253/dummy_model.mps --dec ./issue253/dummy_o596_noRay.dec

Both found an objective value of 3.79398e+07. Solving the *.mps file directly with CPLEX gives the same objective value, which is expected. So reading the problem instances looks OK.

Running DW, I indeed got unexpected results.

Thanks for sharing with us the problem instances. We will look into the issues and keep you updated here.

kibaekkim (Collaborator) commented Feb 15, 2023

@phumthep I have one update. The file dummy_o596_noRay.dec creates a decomposition with coupling variables. This is not a valid form of Dantzig-Wolfe decomposition (dw in DSP). The current implementation does not detect whether a decomposition file results in coupling variables. #256 will address this in the future.

I will keep looking into the other instance.
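Until such a check lands in DSP, a decomposition can be screened for coupling variables by hand: a variable that appears in constraints assigned to two different blocks couples those blocks. Below is a hypothetical standalone check, not part of DSP; `block_conss` would come from the .dec file, `cons_vars` from the model, and all names here are invented for illustration:

```python
from collections import defaultdict

def coupling_variables(block_conss, cons_vars):
    """Return {variable: set of blocks} for variables spanning >1 block."""
    var_blocks = defaultdict(set)
    for block, conss in block_conss.items():
        for cons in conss:
            for var in cons_vars[cons]:
                var_blocks[var].add(block)
    return {v: b for v, b in var_blocks.items() if len(b) > 1}

# Toy instance: variable "flow_12" shows up in both block 1 and block 2.
block_conss = {1: ["balance_n1"], 2: ["balance_n2"]}
cons_vars = {"balance_n1": ["gen_1", "flow_12"],
             "balance_n2": ["gen_2", "flow_12"]}

print(coupling_variables(block_conss, cons_vars))  # {'flow_12': {1, 2}}
```

Any variable the check flags must either be moved into the master (a different decomposition) or the offending constraints merged into one block before the `dw` algorithm applies.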
