Incorrect handling of constraints with identical variables causing infeasible models #396

Open
ItsNiklas opened this issue Nov 11, 2024 · 0 comments · May be fixed by #397

Describe the bug
There is a bug where the LinExpr constructor incorrectly handles expressions involving identical variables, leading to wrong or infeasible models. When a constraint references the same Var object more than once, the dict constructor used to build the expression overwrites the earlier coefficient during construction.

Importantly, Var.__add__, Var.__sub__, Var.__rsub__, Var.__eq__, Var.__le__, and Var.__ge__ can all trigger this issue when both operands are the same Var.

>>> m = Model(); x = m.add_var('x')
>>> m += x - x >= 0; print(m.constrs[0])
constr(0): -1.0 x >= -0.0

For example, the subtraction override calls the constructor with the coefficients [1, -1]. If the variables are the same object, the dict(zip()) constructor keeps only the last coefficient and the first one is silently dropped!
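
The mechanism can be seen in plain Python, without mip involved: duplicate keys in dict(zip(...)) collapse to whichever value comes last.

>>> dict(zip(["x", "x"], [1, -1]))
{'x': -1}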

Why would the variables be the same object? This easily happens in problems where some variable has to serve as a fixed reference point, for example order-based problems. As soon as you build constraints in a loop over pairs of variables and forget to skip the case where both are the same, you end up with a faulty constraint (see the sketch below).
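
A hypothetical illustration of that pattern (the model, the y variables, and the ref variable are made up for this sketch):

from mip import Model

m = Model()
y = [m.add_var(f"y_{i}") for i in range(3)]
ref = y[0]  # y_0 doubles as the fixed reference point

for v in y:
    # Forgetting "if v is ref: continue" triggers the bug:
    # for v is ref, "v - ref >= 0" should be the trivial 0 >= 0,
    # but it is parsed as "-1.0 y_0 >= -0.0" instead.
    m += v - ref >= 0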

For example, where I first encountered the bug:
Partial-Ordering based Graph Coloring uses a constraint like this: https://arxiv.org/pdf/1706.10191 (p. 6). Constraint (21) has exactly this situation: if you are not careful and v == q occurs, python-mip breaks.

Additional context
The issue was introduced by commit f9e2046, which removed some variable cleanup by avoiding the self.add_var call. That cleanup is what prevented coefficients from being overwritten when a constraint contains duplicate variables.
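
For reference, a minimal sketch of the kind of per-variable aggregation that avoids the overwrite (an illustration only, not python-mip's actual LinExpr internals):

def build_expr(variables, coeffs):
    """Sum coefficients per variable instead of overwriting them."""
    expr = {}
    for v, c in zip(variables, coeffs):
        expr[v] = expr.get(v, 0.0) + c
    return expr

print(build_expr(["x", "x"], [1, -1]))  # {'x': 0.0} -- x - x correctly cancels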

To Reproduce
Minimal working example:

from mip import *
m = Model()
x = m.add_var()
m += x >= 1      # forces x to be at least 1
m += x - x >= 0  # trivially satisfiable (0 >= 0), but parsed as -x >= 0
m.optimize()
Welcome to the CBC MILP Solver 
Version: devel 
Build Date: Nov  8 2024
Starting solution of the Linear programming problem using Dual Simplex

Coin0507I Presolve determined that the problem was infeasible with tolerance of 1e-08
Clp3003W Analysis indicates model infeasible or unbounded
Clp0001I Primal infeasible - objective value 0
Clp0032I PrimalInfeasible objective 0 - 1 iterations time 0.002
<OptimizationStatus.INFEASIBLE: 1>

This results in the model being reported as infeasible, even though the constraints are valid. The reason is that the second constraint got parsed as -x >= 0, which contradicts x >= 1.

The bug can be triggered manually by constructing the expression, or through any of the faulty Var overrides, with any variable type:

'Python 3.13.0 (main, Oct  8 2024, 00:00:00) [GCC 14.2.1 20240912 (Red Hat 14.2.1-3)] on linux'
>>> from mip import *
>>> m = Model(); x = m.add_var('x')
>>> m += x + x <= 1
>>> m += x - x <= 1
>>> m += x == x
>>> m += x <= x
>>> m += x >= x
>>> list(map(str, m.constrs))
['constr(0): +1.0 x <= 1.0', 'constr(1): -1.0 x <= 1.0', 'constr(2): -1.0 x = -0.0', 'constr(3): -1.0 x <= -0.0', 'constr(4): -1.0 x >= -0.0']

Also, I have read #163, and it seems like the speed improvements did not come without side effects.

Python-MIP version: affected since commit f9e2046 (Sep. 2021, i.e. >= 1.14.0), found via git-bisect.

ItsNiklas added a commit to ItsNiklas/python-mip that referenced this issue Nov 11, 2024
ItsNiklas linked a pull request Nov 11, 2024 that will close this issue