How to add a custom constraint
My optimization problem is the following: I want to maximize the sum over a set of binary variables. The only constraint I have is this: I transform the ones in my binary array into the corresponding indices, e.g. [1 0 0 1 0] -> [0 3]. I then use these indices to compute a value from the sub-matrix M[:, indices]. This value needs to stay under a certain threshold.
I already have a good initial guess. If an additional zero gets flipped to a one during the optimization and the resulting value is over the threshold, it makes no sense to investigate this combination any further, so the one needs to be flipped back to a zero.
I have seen the custom lazy constraint handler for the traveling salesman problem, but I still have trouble understanding what the functions createCons, conscheck, and consenfolp do. I really do not know where to add the custom function I use for calculating my matrix value.
Hey @dezenn! Can you please give a bit more information on what you're trying to do? What do you mean by "the value of a matrix M[:, indices]"?
Hey @Joao-Dionisio for sure!
I have a matrix M of shape (output features, parameters); let's say the number of parameters is N. I want to maximize over a binary array of size N. For the constraint, I take the binary array and put it into a non-linear function I wrote. That's why I didn't include more information at first; I thought it wouldn't be relevant, sorry.
What this function does is the following: consider N=4, so 4 parameters, and a start solution of [1 0 1 0]. The function transforms this binary array into [0 2], corresponding to the indices where the ones are. I then consider only the first and third columns of my matrix, i.e. M[:, [0, 2]]. From this sub-matrix I calculate a value, basically the smallest eigenvalue, which needs to be under a threshold. This is the constraint I want to implement.
I am basically trying to implement an algorithm from a paper I read. If you want, I can provide the literature, and also what I have tried so far; it is basically messing around with the custom constraint handler. Thank you!
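To make the constraint evaluation above concrete, here is a minimal sketch of the binary-to-indices transform and the eigenvalue check, using numpy as a stand-in for torch (the function and variable names are illustrative, not from the actual code):

```python
import numpy as np

def binary_to_indices(x):
    """[1, 0, 1, 0] -> [0, 2]: positions of the ones."""
    return [i for i, v in enumerate(x) if v > 0.5]

def collinearity(M):
    """1 / sqrt(smallest eigenvalue of M^T M)."""
    # M^T M is symmetric, so eigvalsh is appropriate (real eigenvalues)
    eigvals = np.linalg.eigvalsh(M.T @ M)
    return 1.0 / np.sqrt(eigvals.min())

x = [1, 0, 1, 0]                 # N = 4 parameters, start solution
M = np.eye(3, 4)                 # toy matrix: 3 output features, 4 parameters
indices = binary_to_indices(x)   # -> [0, 2]
value = collinearity(M[:, indices])
feasible = value <= 20           # threshold check
```

Here the selected columns are orthonormal, so the smallest eigenvalue of the sub-matrix product is 1 and the check passes.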
Yeah, if you don't mind, I think it would be easier to look at your code and see what the end goal is. It also seems like an interesting problem :)
But it does seem like a constraint handler is what you want
import pickle

import torch
from pyscipopt import Model, Conshdlr, SCIP_RESULT, quicksum

# this is my custom function
def collinearity(matrix):
    matrix_mult = torch.matmul(matrix.T, matrix)
    all_eigenvals = torch.real(torch.linalg.eigvals(matrix_mult))
    smallest_eigenval = torch.min(all_eigenvals)
    return 1 / torch.sqrt(smallest_eigenval)
class CollinearityConshdlr(Conshdlr):
    def __init__(self, S, threshold, xvars):
        self.S = S
        self.threshold = threshold
        self.xvars = xvars

    def createCons(self, name, variables):
        model = self.model
        cons = model.createCons(self, name)
        cons.data = {}
        cons.data['vars'] = variables
        return cons
    # Check feasibility of found solution
    def conscheck(self, constraints, solution, check_integrality,
                  check_lp_rows, print_reason, completely, **results):
        # Transform the current binary solution to indices for the matrix
        indices = [i for i, var in self.xvars.items() if solution[var] > 0.5]
        # No need to check any further
        if len(indices) <= 1:
            return {'result': SCIP_RESULT.FEASIBLE}
        M = self.S[:, indices]
        col = collinearity(M)
        if col > self.threshold:
            return {'result': SCIP_RESULT.INFEASIBLE}
        return {'result': SCIP_RESULT.FEASIBLE}

    # Here I really do not know what to do. Do I need my constraint again? If so, how can I get the current solution?
    # I want to express that when a combination is above the threshold, the branch does not need to be investigated any further.
    # I am unfortunately very unfamiliar with SCIP, as I usually solve continuous optimization problems.
    def consenfolp(self, constraints, n_useful_conss, sol_infeasible):
        # Query the current LP solution (sol=None), not the best known solution
        indices = [i for i, var in self.xvars.items()
                   if self.model.getSolVal(None, var) > 0.5]
        if len(indices) <= 1:
            return {'result': SCIP_RESULT.FEASIBLE}
        M = self.S[:, indices]
        col = collinearity(M)
        if col > self.threshold:
            # No-good cut: forbid selecting this exact set of columns again
            self.model.addCons(
                quicksum(self.xvars[i] for i in indices) <= len(indices) - 1,
                name='nogood_col_cut'
            )
            return {'result': SCIP_RESULT.CONSADDED}
        return {'result': SCIP_RESULT.FEASIBLE}

    # def consenfops(self, *args, **kwargs):
    #     return {'result': SCIP_RESULT.FEASIBLE}

    def conslock(self, constraint, locktype, nlockspos, nlocksneg):
        pass
def solve_with_collinearity_constraint(S, threshold, initial_guess):
    m, n = S.shape
    model = Model('collinearity_model')
    x = {}
    for i in range(n):
        x[i] = model.addVar(vtype='B', name=f'x_{i}')
    model.setObjective(quicksum(x[i] for i in range(n)), 'maximize')

    # Initial guess
    sol = model.createSol()
    for i in range(n):
        model.setSolVal(sol, x[i], initial_guess[i])
    model.addSol(sol)

    conshdlr = CollinearityConshdlr(S, threshold, x)
    model.includeConshdlr(
        conshdlr,
        name='collinearity',
        desc='collinearity threshold constraint',
        sepapriority=0,
        enfopriority=-10,
        chckpriority=-10,
        needscons=False
    )
    cons = conshdlr.createCons('collinearity_cons', x)
    model.addPyCons(cons)

    model.optimize()
    sol = model.getBestSol()
    if sol is None:
        print('No feasible solution')
        return None
    chosen = [i for i in range(n) if sol[x[i]] > 0.5]
    print('Selected columns:', chosen)
    print('Objective value:', model.getObjVal())
    print(f'Collinearity: {collinearity(S[:, chosen])}\n')
    return chosen
if __name__ == '__main__':
    threshold = 20
    load_from = '..\\adress_of_initial_guess\\'

    # Load the initial guess
    with open(load_from + 'initial_guess.pkl', 'rb') as fp:
        initial_guess = pickle.load(fp)
    N = len(initial_guess)

    # Load the matrix
    with open(load_from + 'matrix.pkl', 'rb') as fp:
        matrix = pickle.load(fp)

    indices = [i for i, val in enumerate(initial_guess) if val > 0.5]
    print(f'Initial solution: {indices}')
    print(f'Objective value: {initial_guess.sum()}')
    print(f'Collinearity: {collinearity(matrix[:, indices])}\n')

    solve_with_collinearity_constraint(matrix, threshold, initial_guess)
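For debugging the constraint handler on small instances, a brute-force reference could be useful (a hypothetical helper, sketched with numpy instead of torch): it enumerates column subsets from largest to smallest and returns the first one that passes the collinearity check, which by construction has maximum cardinality.

```python
import itertools
import numpy as np

def brute_force_reference(S, threshold):
    """Largest feasible column subset by exhaustive search (small N only).

    Iterates subset sizes from n down to 1, so the first feasible subset
    found has maximum cardinality, matching the model's objective.
    """
    _, n = S.shape
    for r in range(n, 0, -1):
        for combo in itertools.combinations(range(n), r):
            M = S[:, list(combo)]
            lam_min = np.linalg.eigvalsh(M.T @ M).min()
            # Skip singular sub-matrices, then apply the threshold check
            if lam_min > 0 and 1.0 / np.sqrt(lam_min) <= threshold:
                return list(combo)
    return []

# Tiny check: with S = I_{3x4}, the zero fourth column makes the full
# selection singular, so the best feasible subset is the first three columns.
print(brute_force_reference(np.eye(3, 4), threshold=20))  # [0, 1, 2]
```

Comparing this against the SCIP output on toy matrices is a cheap way to catch bugs in conscheck/consenfolp before running on real data.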
The How to add... pages on SCIP are quite helpful; I often go there. While there are C-SCIP-specific things, they also give some perspective on how things work. The CONSENFOLP section of the constraint handler documentation is the relevant one here.
So it's actually there that the constraint should be added, not in conscheck, which should just check the feasibility of a solution. Looking at the source code, we see that one of the parameters of consenfolp is solinfeasible, which I assume would allow you to add a constraint forbidding the combination you want.
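The no-good-cut idea behind "forbidding the combination" can be sanity-checked without SCIP at all. A plain-Python sketch (names are illustrative): the linear constraint sum over the bad indices of x_i <= |bad| - 1 is violated exactly when all of the offending indices are selected again, and satisfied by every assignment that drops at least one of them:

```python
def nogood_cut_satisfied(x, bad_indices):
    """True iff x satisfies the no-good cut sum_{i in bad} x_i <= |bad| - 1."""
    return sum(x[i] for i in bad_indices) <= len(bad_indices) - 1

# The offending selection [0, 2] is cut off...
assert not nogood_cut_satisfied([1, 0, 1, 0], bad_indices=[0, 2])
# ...while dropping one of its columns remains allowed.
assert nogood_cut_satisfied([1, 0, 0, 1], bad_indices=[0, 2])
```

In the handler, this is the constraint one would add via model.addCons inside consenfolp before returning CONSADDED.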
Do you think this can help you? Let me know how it's going!
The tsp_lazy example is in the unfinished examples, and it has some mistakes; at least I can't run it. As a rule of thumb, the unfinished examples may be helpful for getting familiar with the code, but they are indeed unfinished.
As a fun fact, it was my advisor who added the majority of examples here in the repo :)
@dezenn,
Is the question still relevant? If so, based on "From this sub-matrix I calculate a value, basically the smallest eigenvalue, which needs to be under a threshold. This is my constraint I want to implement", is the form of your specific cut something like this: 1 / torch.sqrt(smallest_eigenval) <= threshold?
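If that is indeed the form, note that the square root can be eliminated by rearranging: for lambda_min > 0 and threshold > 0, the check 1 / sqrt(lambda_min) <= threshold is equivalent to lambda_min >= 1 / threshold**2. A small sketch (function name is illustrative):

```python
import math

def collinearity_ok(lam_min, threshold):
    # 1/sqrt(lam) <= t  <=>  lam >= 1/t**2, for lam > 0 and t > 0
    return lam_min >= 1.0 / threshold ** 2

# Agrees with the original nonlinear check on either side of the boundary
for lam in (0.001, 0.0025, 0.01, 1.0):
    assert collinearity_ok(lam, 20) == (1.0 / math.sqrt(lam) <= 20)
```

The rearranged form only involves the eigenvalue itself, which may be easier to reason about when deciding what to hand to SCIP.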
@Joao-Dionisio,
Is there any way to pass torch methods as a variable in SCIP?