**The 17th workshop of NordConsNet, the Nordic Network for researchers and practitioners of Constraint Programming**

The Nordic Network for researchers and practitioners of Constraint Programming (NordConsNet) kindly invites you to participate in the yearly NordConsNet Workshop. The purpose of the workshop is to learn about ongoing research in Constraint Programming and optimisation, existing projects and products, and further development of the network. NordConsNet Workshop 2018 is the 17th edition of this annual event.

The workshop will take place at the **Palace, Göteborg, Södra Hamngatan 2**, on **Tuesday 29 May 2018**; see the Palace webpage. The venue is a 5-minute walk from the main railway station. The conference room is on the 2nd floor.

To register, please send an email to Waldemar Kocjan. Registration including lunch and catering must be completed prior to **19 May**. If you register after that date, you are still very welcome to attend any talks during the day; just be mindful not to enter during a talk, and don't expect catering.

Please forward this information to anyone who might be interested in this workshop but is not yet on the NordConsNet mailing list: they can subscribe to it by applying to Justin Pearson.

The NordConsNet Workshop 2018 is organised and sponsored by Jeppesen Systems. Please send any questions regarding the workshop to Waldemar Kocjan at Jeppesen.

Time | Presenter | Title | Presentation |
---|---|---|---|
10:00-10:05 | Waldemar Kocjan (Jeppesen) | Welcome | - |
10:05-10:35 | Mattias Grönkvist (Jeppesen) | Constraint Programming and Optimization at Jeppesen | - |
10:35-11:05 | Emily Curry (Chalmers/Jeppesen) | Alternative Pricing for Rostering Column Generation | - |
11:05-11:30 | Fika | Tea & Coffee | - |
11:30-12:00 | Dag Wedelin (Chalmers) | On the in-the-middle algorithm and heuristic and some of its properties | - |
12:00-12:30 | Gustav Björdal (Uppsala University) | Declarative Local-Search Neighbourhoods in MiniZinc | - |
12:30-13:00 | Helge Spieker (Simula Research Lab, Norway) | Estimating Objective Boundaries for Constraint Optimization Problems | - |
13:00-14:00 | Lunch | Food | - |
14:00-14:30 | Christian Schulte (KTH/RISE) | GECODE: Generic Constraint Development Environment | - |
14:30-15:00 | Linnea Ingmar (Uppsala University) | Making Compact-Table Compact | - |
15:00-15:30 | Christos Dimitrakakis (Chalmers) | DUCT: An Upper Confidence Bound Approach to Distributed Constraint Optimization Problems | - |
15:30-15:50 | Fika | Tea & Coffee | - |
15:50-16:20 | Stephan Gocht (KTH) | Understanding Conflict-Driven Clause Learning SAT Solvers Through the Lens of Proof Complexity | - |
16:20-16:50 | Jan Elffers (KTH) | Divide and Conquer: Towards Faster Pseudo-Boolean Solving | - |
16:50 | - | Closing | - |

**Alternative Pricing for Rostering Column Generation** (Emily Curry)

Three different methods for solving the black-box pricing problem have been implemented. The first method uses binary particle swarm optimization (BPSO) to search for improving rosters. The other two methods use surrogate modeling to fit a nonlinear surrogate function to a set of sampled rosters using radial basis functions. The surrogate function is then either linearly approximated, so that a shortest-path problem can be set up and solved, or optimized heuristically by a BPSO method.

The three methods have been evaluated on five real-world test cases. For each test case, a large number of different pricing problems are solved. Our comparison of the methods' performance shows that the method using BPSO performed the best, followed by the surrogate modeling approach without the linear approximation.
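The abstract does not spell out the BPSO update itself; as a rough, hypothetical illustration of how a generic binary PSO step works (using the standard sigmoid transfer function from Kennedy and Eberhart's binary PSO, not necessarily the variant used in this work):

```python
import math
import random

def bpso_step(positions, velocities, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One iteration of binary PSO: pull each particle's velocity toward its
    personal best and the global best, then resample each bit with
    probability sigmoid(velocity)."""
    for i, (x, v) in enumerate(zip(positions, velocities)):
        for d in range(len(x)):
            r1, r2 = random.random(), random.random()
            v[d] = (w * v[d]
                    + c1 * r1 * (pbest[i][d] - x[d])
                    + c2 * r2 * (gbest[d] - x[d]))
            prob = 1.0 / (1.0 + math.exp(-v[d]))  # sigmoid transfer function
            x[d] = 1 if random.random() < prob else 0
    return positions, velocities
```

In a pricing context, each bit would encode whether an activity is included in a candidate roster, and the fitness function would be the (black-box) reduced cost.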

**Declarative Local-Search Neighbourhoods in MiniZinc** (Gustav Björdal)

MiniZinc is a solver-independent modelling language, supported by CP, MIP, SAT, SMT, and (constraint-based) local search (CBLS) backends. Some solving technologies, in particular CP and CBLS, require not only a model but also a search strategy. While backends for these technologies offer default (autonomous) search strategies, it is often beneficial to be able to include in a model a user-specified search strategy for a particular solving technology, especially if the search strategy can encapsulate knowledge about the structure of the problem. This is complex since a local-search strategy is often tightly tied to the model. Hence we wish to use the same language for specifying the model and the local search.

In this presentation, I will show how we extended MiniZinc so that one can attach a fully declarative neighbourhood specification to a model, while maintaining the solver-independence of the language.
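The key idea is separating *what* the moves of a neighbourhood are from *how* they are searched. The actual extension is expressed in MiniZinc; the following is only a language-agnostic Python sketch of that separation, with all names hypothetical:

```python
def swap_neighbourhood(x):
    """Declarative part: enumerate all moves that swap two positions of x.
    This says what the moves are, not how to explore them."""
    n = len(x)
    for i in range(n):
        for j in range(i + 1, n):
            yield (i, j)

def apply_move(x, move):
    """Apply a swap move, returning a new assignment."""
    i, j = move
    y = list(x)
    y[i], y[j] = y[j], y[i]
    return y

def local_search(x, cost, steps=100):
    """Generic first-improvement search, driven by any neighbourhood
    generator; the search loop never needs model-specific knowledge."""
    for _ in range(steps):
        improved = False
        for move in swap_neighbourhood(x):
            y = apply_move(x, move)
            if cost(y) < cost(x):
                x, improved = y, True
                break
        if not improved:
            break
    return x
```

Because the search loop only consumes a stream of moves, the same backend could drive any neighbourhood specification attached to a model.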

**GECODE: Generic Constraint Development Environment** (Christian Schulte)

The talk provides an overview of what Gecode is, what one can do with it, and the basic ideas behind its architecture.

**Making Compact-Table Compact** (Linnea Ingmar)

The presentation introduces techniques to make compact-table an excellent fit for a copying solver. The key is to make sparse bit-sets dynamically compact (only their essential parts occupy memory and their implementation is dynamically adapted during search) and tables shared (their read-only parts are shared among copies). We evaluate our techniques on top of Gecode as a copying solver. Dynamically compact bit-sets reduce peak memory by 7.5% and runtime by 13.6% on average, and by up to 66.3% and 33.2%, respectively. Shared tables further reduce runtime and memory usage. The reduction in runtime exceeds the reduction in memory, and a cache analysis indicates that our techniques might also be beneficial for trailing solvers. The proposed implementation runs on average almost an order of magnitude faster than Gecode's original implementations while using half the memory.
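As a rough sketch of the sparse bit-set idea behind compact-table (a deliberately simplified, non-reversible Python version, not the actual Gecode implementation): words that become zero are swapped past a `limit` index, so later intersections never scan them again.

```python
class SparseBitSet:
    """Simplified sparse bit-set: `index[0..limit]` holds the indices of
    words that may still be non-zero; everything beyond `limit` is zero."""
    WORD = 64

    def __init__(self, nbits):
        nwords = (nbits + self.WORD - 1) // self.WORD
        self.words = [(1 << self.WORD) - 1] * nwords
        extra = nwords * self.WORD - nbits
        if extra:                      # clear bits beyond nbits
            self.words[-1] >>= extra
        self.index = list(range(nwords))
        self.limit = nwords - 1

    def intersect(self, mask_words):
        """Intersect with a mask; words that become zero are swapped past
        `limit` so the live part of the set shrinks monotonically."""
        i = self.limit
        while i >= 0:
            w = self.index[i]
            self.words[w] &= mask_words[w]
            if self.words[w] == 0:
                self.index[i], self.index[self.limit] = \
                    self.index[self.limit], self.index[i]
                self.limit -= 1
            i -= 1

    def is_empty(self):
        return self.limit < 0
```

In compact-table, such a set tracks which table rows are still supported; the dynamic compaction in the talk goes further, also shrinking the memory the set occupies in each copy.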

**DUCT: An Upper Confidence Bound Approach to Distributed Constraint Optimization Problems** (Christos Dimitrakakis)

Experimentally, we show that DUCT matches the optimal solution found by the well-known DPOP and O-DPOP algorithms on moderate-size problems, while always requiring less agent communication. For larger problems, where DPOP fails, we show that DUCT produces significantly better solutions than local, incomplete algorithms. Overall we believe that DUCT is a practical, scalable algorithm for complex DCOPs.
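DUCT builds on upper confidence bounds; the flavor of the underlying bandit rule can be sketched with plain UCB1 (this is the generic single-agent rule, not DUCT's distributed variant):

```python
import math

def ucb1(mean_reward, n_arm, n_total, c=math.sqrt(2)):
    """UCB1 score: empirical mean plus a confidence radius that grows for
    rarely tried arms, balancing exploitation and exploration."""
    return mean_reward + c * math.sqrt(math.log(n_total) / n_arm)

def pick_arm(means, counts):
    """Choose the arm with the highest UCB1 score (assumes every arm has
    been tried at least once, so counts are positive)."""
    total = sum(counts)
    return max(range(len(means)), key=lambda a: ucb1(means[a], counts[a], total))
```

In a DCOP setting, the "arms" correspond to value choices of an agent, with rewards derived from the costs observed under sampled assignments of the other agents.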

**Understanding Conflict-Driven Clause Learning SAT Solvers Through the Lens of Proof Complexity** (Stephan Gocht)

Resolution, the proof system underlying CDCL, is well understood from a proof-complexity point of view. Therefore, we use theoretical benchmarks, which have the advantage that they are (a) scalable, (b) extremal with respect to different theoretical properties, and (c) theoretically easy in the sense that they have short resolution proofs. This allows studying how efficient the proof search of a solver is.

In this work we evaluate different solver configurations on theoretical benchmarks, which allows us for the first time to relate the performance of these different solver configurations to theoretical properties of the benchmarks, and raises a number of intriguing questions for further research.

**Divide and Conquer: Towards Faster Pseudo-Boolean Solving** (Jan Elffers)

We propose a modified approach to pseudo-Boolean solving, using division instead of the saturation rule in [Chai and Kuehlmann '05] and other PB solvers. In addition to resulting in a stronger conflict analysis, this also keeps integer coefficient sizes down, which further improves performance. Together with some other optimizations, this yields a solver that performed very well in the Pseudo-Boolean Competitions 2015 and 2016, and that has speed approaching that of CDCL-based methods, measured in terms of number of conflicts per second.
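The division rule referred to here is the cutting-planes rule: from a constraint sum(a_i * x_i) >= A with non-negative coefficients over 0/1 variables, dividing by d and rounding every coefficient and the degree up gives sum(ceil(a_i / d) * x_i) >= ceil(A / d). A minimal sketch:

```python
def divide_pb(coeffs, degree, d):
    """Cutting-planes division rule, sound for 0/1 variables with
    non-negative coefficients. Integer ceiling computed as -(-a // d)."""
    ceil_div = lambda a: -(-a // d)
    return [ceil_div(a) for a in coeffs], ceil_div(degree)

# Example: 3x + 3y + 2z >= 4, divided by 3, becomes x + y + z >= 2.
```

Division shrinks the coefficients themselves, whereas the saturation rule only caps each coefficient at the degree, which is one reason division keeps integer arithmetic cheaper during conflict analysis.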