projects

Here are some things I have been working on.

  • Learning to Cut by Looking Ahead: Cutting Plane Selection via Imitation Learning
    Joint with Max B. Paulus, Andreas Krause, Laurent Charlin and Chris J. Maddison. ICML 2022.
    [paper]
    We propose a new neural architecture (NeuralCut) for cut selection in Mixed-Integer Linear Programming (MILP). We train the model to imitate a novel “lookahead” expert rule, which explicitly measures which cuts yield the best bound improvement. Our model outperforms standard baselines for cut selection on several synthetic MILP benchmarks and shows its full potential when deployed in a realistic MILP solver.
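The lookahead idea can be sketched in a few lines: score each candidate cut by the dual-bound improvement obtained after adding it alone, and let that best-scoring cut be the expert's label for imitation. This is an illustrative sketch, not the paper's code; `solve_lp_bound` stands in for re-solving the LP relaxation with one cut added, and the cut names and bounds below are made up.

```python
def lookahead_scores(bound_before, candidate_cuts, solve_lp_bound):
    """Score each cut by the LP bound improvement it yields when added alone."""
    return [solve_lp_bound(cut) - bound_before for cut in candidate_cuts]

def expert_label(bound_before, candidate_cuts, solve_lp_bound):
    """Index of the cut the lookahead expert would select (imitation target)."""
    scores = lookahead_scores(bound_before, candidate_cuts, solve_lp_bound)
    return max(range(len(scores)), key=scores.__getitem__)

# Toy usage: hypothetical bounds obtained after adding each cut to an LP
# whose current relaxation bound is 10.0.
bounds_after = {"c1": 10.5, "c2": 12.0, "c3": 11.0}
cuts = sorted(bounds_after)
best_cut = cuts[expert_label(10.0, cuts, bounds_after.get)]  # "c2"
```

A learned policy is then trained to reproduce `expert_label` from cut features alone, avoiding the expensive per-cut LP re-solves at deployment time.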

  • A Classifier to Decide on the Linearization of MIQPs in CPLEX +
    Learning a Classification of Mixed-Integer Quadratic Programming Problems
    Joint with Pierre Bonami and Andrea Lodi. Operations Research 2022 + CPAIOR 2018.
    [code] [2022 paper] [2018 paper]
    We translate the algorithmic question of whether to linearize convex Mixed-Integer Quadratic Programming problems (MIQPs) into a classification task, and tackle it with machine learning techniques. We represent MIQPs and the linearization decision through careful target and feature engineering, and design learning experiments and evaluation metrics accordingly. As a practical result, we deploy an SVM regression+classification pipeline that decides on MIQP linearization in the IBM CPLEX 12.10.0 commercial solver.
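The decision logic can be illustrated as follows: a regression model predicts the solve-time ratio between the linearized and non-linearized formulations, and a classification step maps that prediction to a decision, with an indifference band where neither formulation clearly wins. This is a hypothetical sketch of the pipeline's final step, not CPLEX's actual implementation; the threshold and the `margin` parameter are placeholders.

```python
def decide_linearization(predicted_time_ratio, margin=0.1):
    """Map a predicted (linearized / non-linearized) solve-time ratio to a
    decision, leaving a 'no clear preference' band around 1.0.
    Hypothetical thresholds for illustration only."""
    if predicted_time_ratio < 1.0 - margin:
        return "linearize"
    if predicted_time_ratio > 1.0 + margin:
        return "keep quadratic"
    return "no clear preference"

# Toy usage: a ratio of 0.8 means the linearized formulation is predicted
# to solve faster, so the classifier votes to linearize.
decision = decide_linearization(0.8)  # "linearize"
```

In the deployed pipeline the ratio itself comes from a learned regressor over engineered MIQP features; the sketch only shows how a regression output is turned into a discrete decision.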

  • Parameterizing Branch-and-Bound Search Trees to Learn Branching Policies
    Joint with Jason Jo, Andrea Lodi and Yoshua Bengio. AAAI 2021.
    [code] [paper] [poster]
    We tackle variable selection in Branch-and-Bound (B&B) for Mixed-Integer Linear Programming (MILP) problems, seeking policies that generalize across heterogeneous MILPs. We develop new parameterizations of the candidate variables and of the search trees, and design DNN architectures to handle them. Our results show that incorporating a search-tree context to modulate branching aids generalization, with better test accuracy and smaller B&B trees.
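The core architectural idea of modulating candidate scores with a tree context can be sketched with a toy linear scorer: the search-tree context gates each candidate's features elementwise before a scoring head ranks them. All numbers and names here are illustrative placeholders, not the paper's trained DNN.

```python
def branching_scores(candidate_feats, tree_context, weights):
    """Score branching candidates with features elementwise-gated by a
    search-tree context vector, then reduced by a linear head."""
    scores = []
    for feats in candidate_feats:
        gated = [f * c for f, c in zip(feats, tree_context)]
        scores.append(sum(w * g for w, g in zip(weights, gated)))
    return scores

# Toy usage: two candidate variables with 2-dim features; the context
# up-weights the second feature, changing which candidate wins.
cands = [[1.0, 0.0], [0.5, 1.0]]
ctx = [1.0, 2.0]
w = [1.0, 1.0]
scores = branching_scores(cands, ctx, w)          # [1.0, 2.5]
best = max(range(len(scores)), key=scores.__getitem__)  # candidate 1
```

The point of the sketch is the interaction: without the context gate the same weights could rank candidates differently, which is the mechanism the paper's parameterization exploits for generalization.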

  • Learning MILP Resolution Outcomes Before Reaching Time-Limit
    Joint with Martina Fischetti and Andrea Lodi. CPAIOR 2019.
    [code] [paper]
    Looking at the evolution of a partial branch-and-bound (B&B) tree for a MILP instance, we aim to predict whether the problem will be solved to proven optimality before timing out. We exploit machine learning tools, summarizing time-series information from the MILP resolution process to cast the prediction as a classification task. Experiments show that a valuable statistical pattern can indeed be learned during B&B optimization, with key predictive features reflecting the know-how and experience of practitioners in the field.
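The time-series summarization step can be illustrated with a toy example: the primal-dual gap values observed so far are collapsed into a few fixed-size features that any standard classifier can consume. The feature names are hypothetical, chosen to convey the idea rather than reproduce the paper's feature set.

```python
def summarize_gap_series(gaps):
    """Collapse the primal-dual gap time series observed so far into
    fixed-size summary features for a standard classifier.
    Illustrative features only."""
    n = len(gaps)
    return {
        "last_gap": gaps[-1],                          # where we are now
        "mean_gap": sum(gaps) / n,                     # overall level
        "gap_slope": (gaps[-1] - gaps[0]) / max(n - 1, 1),  # closing rate
    }

# Toy usage: a steadily closing gap suggests the instance may finish
# before the time limit.
feats = summarize_gap_series([0.40, 0.30, 0.25, 0.10])
```

A classifier trained on such summaries (together with the observed label "solved before timeout: yes/no") is then what produces the early prediction.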

  • On learning and branching: a survey
    Joint with Andrea Lodi. TOP 2017.
    [paper]
    We survey learning techniques for the two most crucial decisions in the branch-and-bound algorithm for Mixed-Integer Linear Programming, namely variable and node selection. We interpret some early contributions in the light of modern learning techniques, and detail recent algorithms that explicitly incorporate machine learning paradigms.