simonblanke | 5 years ago

Okay, that is interesting. You could realize that by carving out a restricted area of the search space: return np.nan from the objective function for those cases. Gradient-Free-Optimizers can handle np.nan and np.inf just fine.

Maybe you could do something like:

    return np.nan if a + b >= 1 else score
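
A minimal sketch of that pattern with Gradient-Free-Optimizers (the search space, the toy score, and the choice of RandomSearchOptimizer below are my own assumptions for illustration, not from the thread):

    import numpy as np
    from gradient_free_optimizers import RandomSearchOptimizer

    def objective(para):
        a, b = para["a"], para["b"]
        if a + b >= 1:
            # infeasible region: the optimizer tolerates nan and keeps searching
            return np.nan
        # toy score to maximize; replace with the real objective
        return -((a - 0.3) ** 2 + (b - 0.4) ** 2)

    search_space = {
        "a": np.arange(0, 1, 0.01),
        "b": np.arange(0, 1, 0.01),
    }

    opt = RandomSearchOptimizer(search_space)
    opt.search(objective, n_iter=200)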

ialyos | 5 years ago

This is, at a high level, how one of our internal black-box optimizers at Google behaves. The term used is "infeasible region", but it's the same idea as using a nan.

I would caution against using nan to always mean infeasible. Instead, users should catch experiments outside the feasibility region and return a special infeasible value. This increases visibility into the optimizer's behavior, because it leaves nan free to mark points inside the feasible region that are still problematic (due to bugs, numerical instability, etc.).
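
One way to implement that separation, as a hedged sketch (the sentinel value and the run_experiment stand-in are hypothetical, not from any specific internal tool):

    import numpy as np

    INFEASIBLE = -1e12  # explicit sentinel, distinct from nan

    def run_experiment(a, b):
        # hypothetical stand-in for the real, possibly flaky, evaluation
        return -((a - 0.3) ** 2 + (b - 0.4) ** 2)

    def objective(para):
        a, b = para["a"], para["b"]
        if a + b >= 1:
            # known constraint violation: report it explicitly
            return INFEASIBLE
        score = run_experiment(a, b)
        # a nan escaping run_experiment now unambiguously means something went
        # wrong inside the feasible region (bug, numerical instability, ...)
        return score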

unnah | 5 years ago

If the volume of feasible space is small, the optimizer may have considerable trouble finding the feasible region(s) in the domain. The simple solution is to use a penalty function instead, i.e. add a term like M*max(0, a+b-1) to a minimized objective (or subtract it from a maximized score), with sufficiently large M. If the optimum lies on the boundary of the feasible region, you might still get a solution that is slightly infeasible, which can be worked around with more special-case logic...
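
A sketch of that penalty-function variant (the value of M and the toy score are assumptions; the sign convention assumes a score that is being maximized):

    M = 1e6  # penalty weight, "sufficiently large"

    def objective(para):
        a, b = para["a"], para["b"]
        violation = max(0.0, a + b - 1.0)  # zero inside the feasible region a + b < 1
        score = -((a - 0.3) ** 2 + (b - 0.4) ** 2)  # toy score being maximized
        # the penalty pulls infeasible points down instead of hiding them behind nan
        return score - M * violation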

opium_tea | 5 years ago

Is there a subset of these constrained optimisation techniques that works when the constraint bounds are unknown before evaluation time? This applies to certain physical engineering problems where, due to the complexity of the environmental system you're modelling, you don't necessarily know which combinations of input parameters lead to a non-viable solution. That is, a given solution might be numerically valid, but the objective function could determine it to be non-viable because it fails engineering constraints (e.g. it vibrates too fast).
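
In code, that "constraint only known after evaluation" situation looks roughly like the sketch below; every name and threshold here is a hypothetical illustration of the question, not a recommendation:

    import numpy as np

    def simulate(para):
        # hypothetical stand-in for an expensive physical/engineering model
        x = para["x"]
        return {"vibration_hz": 100.0 + 50.0 * x, "efficiency": -((x - 0.5) ** 2)}

    def objective(para):
        result = simulate(para)
        # viability is only known after the evaluation has already been paid for
        if result["vibration_hz"] > 120.0:
            return np.nan  # or a dedicated infeasible sentinel / penalty
        return result["efficiency"]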

craftinator | 5 years ago

Annnnnnnd you've added gradients to the gradient-free optimizer. Every shovel is a hammer if you try hard enough.