Ing. Tomáš Dlask, Ph.D.

All publications

Activity propagation in systems of linear inequalities and its relation to block-coordinate descent in linear programs

  • DOI: 10.1007/s10601-023-09349-0
  • Link: https://doi.org/10.1007/s10601-023-09349-0
  • Department: Machine Learning
  • Abstract:
    We study a constraint propagation algorithm to detect infeasibility of a system of linear inequalities over continuous variables, which we call activity propagation. Each iteration of this algorithm chooses a subset of the inequalities and if it infers that some of them are always active (i.e., always hold with equality), it turns them into equalities. We show that this algorithm can be described as chaotic iterations and its fixed points can be characterized by a local consistency, in a similar way to traditional local consistency methods in CSP such as arc consistency. Via complementary slackness, activity propagation can be employed to iteratively improve a dual-feasible solution of large-scale linear programs in a primal-dual loop – a special case of this method is the Virtual Arc Consistency algorithm by Cooper et al. As our second contribution, we show that this method has the same set of fixed points as block-coordinate descent (BCD) applied to the dual linear program. While BCD is popular in large-scale optimization, its fixed points need not be global optima even for convex problems and a succinct characterization of convex problems optimally solvable by BCD remains elusive. Our result may open the way for such a characterization since it allows us to characterize BCD fixed points in terms of local consistencies.
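  • Example (illustrative):
    A minimal Python sketch of the activity-propagation idea described in the abstract, assuming SciPy is available. The function name activity_propagation, the choice of subsets, and the LP-based activity test are illustrative assumptions, not the authors' implementation.

    import numpy as np
    from scipy.optimize import linprog

    def activity_propagation(A, b, subsets, tol=1e-9):
        """Propagate activity in the system A x <= b over the given index subsets."""
        A, b = np.asarray(A, float), np.asarray(b, float)
        is_eq = np.zeros(len(b), dtype=bool)  # inequalities promoted to equalities
        changed = True
        while changed:  # "chaotic iterations" until a fixed point is reached
            changed = False
            for S in subsets:  # a chosen subset of inequality indices
                ineq = [i for i in S if not is_eq[i]]
                eq = [i for i in S if is_eq[i]]
                for i in ineq:
                    # Maximize the slack b_i - A_i x over the subsystem indexed by S;
                    # if the maximum is zero, row i is always active in the subsystem,
                    # hence in the whole system, and can be turned into an equality.
                    res = linprog(c=A[i], A_ub=A[ineq], b_ub=b[ineq],
                                  A_eq=A[eq] if eq else None,
                                  b_eq=b[eq] if eq else None,
                                  bounds=[(None, None)] * A.shape[1])
                    if res.status == 2:  # subsystem infeasible => whole system infeasible
                        return "infeasible", is_eq
                    if res.success and b[i] - res.fun <= tol:
                        is_eq[i] = True  # always active: turn the inequality into an equality
                        changed = True
        return "fixed point", is_eq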

Super-Reparametrizations of Weighted CSPs: Properties and Optimization Perspective

  • DOI: 10.1007/s10601-023-09343-6
  • Link: https://doi.org/10.1007/s10601-023-09343-6
  • Department: Machine Learning
  • Abstract:
    The notion of reparametrizations of Weighted CSPs (WCSPs) (also known as equivalence-preserving transformations of WCSPs) is well-known and finds its use in many algorithms to approximate or bound the optimal WCSP value. In contrast, the concept of super-reparametrizations (which are changes of the weights that keep or increase the WCSP objective for every assignment) was already proposed but never studied in detail. To fill this gap, we present a number of theoretical properties of super-reparametrizations and compare them to those of reparametrizations. Furthermore, we propose a framework for computing upper bounds on the optimal value of the (maximization version of) WCSP using super-reparametrizations. We show that it is in principle possible to employ arbitrary (under some technical conditions) constraint propagation rules to improve the bound. For arc consistency in particular, the method reduces to the known Virtual AC (VAC) algorithm. We implemented the method for singleton arc consistency (SAC) and compared it to other strong local consistencies in WCSPs on a public benchmark. The results show that the bounds obtained from SAC are superior for many instance groups.
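  • Example (illustrative):
    A small Python check of the central definition: new weights super-reparametrize the original ones if they keep or increase the objective for every assignment. The toy encoding (nested lists of unary weights and a dictionary of pairwise weight tables) is a hypothetical illustration, not the paper's data structures, and it enumerates all assignments, so it is only feasible for tiny instances.

    from itertools import product

    def wcsp_value(x, unary, pairwise):
        """Objective of a pairwise maximization WCSP for assignment x (a tuple of labels)."""
        value = sum(unary[i][x[i]] for i in range(len(x)))
        value += sum(w[x[i]][x[j]] for (i, j), w in pairwise.items())
        return value

    def is_super_reparametrization(domains, old, new):
        """True iff the new weights keep or increase the objective for every assignment."""
        (u1, p1), (u2, p2) = old, new
        return all(wcsp_value(x, u2, p2) >= wcsp_value(x, u1, p1)
                   for x in product(*(range(d) for d in domains)))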

Classes of linear programs solvable by coordinate-wise minimization

  • DOI: 10.1007/s10472-021-09731-9
  • Link: https://doi.org/10.1007/s10472-021-09731-9
  • Department: Machine Learning
  • Abstract:
    Coordinate-wise minimization is a simple and popular method for large-scale optimization. Unfortunately, for general (non-differentiable and/or constrained) convex problems, its fixed points may not be global minima. We present two classes of linear programs (LPs) that coordinate-wise minimization solves exactly. We show that these classes subsume the dual LP relaxations of several well-known combinatorial optimization problems and that the method finds a global minimum with sufficient accuracy in reasonable runtimes. Moreover, we experimentally show that the method frequently yields good suboptima or even optima for sparse LPs where optimality is not guaranteed in theory. Though the presented problems can be solved by more efficient methods, our results are theoretically non-trivial and can lead to new large-scale optimization algorithms in the future.
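  • Example (illustrative):
    A minimal Python sketch of coordinate-wise minimization on a linear program min c^T x subject to A x <= b, starting from a feasible point. It only illustrates the method whose fixed points are studied in the abstract; it does not verify membership in the presented classes, so in general it may stop at a non-optimal fixed point.

    import numpy as np

    def coordinate_wise_minimization(c, A, b, x0, sweeps=100, tol=1e-12):
        """Exact coordinate-wise minimization for min c^T x s.t. A x <= b (x0 feasible)."""
        x = np.asarray(x0, float).copy()
        for _ in range(sweeps):
            x_old = x.copy()
            for j in range(len(c)):
                col = A[:, j]
                # Row i requires col[i] * x_j <= r_i, where r_i is the slack of row i
                # with the contribution of the current x_j added back.
                r = b - A @ x + col * x[j]
                lo = max((r[i] / col[i] for i in range(len(b)) if col[i] < 0), default=-np.inf)
                hi = min((r[i] / col[i] for i in range(len(b)) if col[i] > 0), default=np.inf)
                # Minimize c_j * x_j over [lo, hi]; keep x_j when c_j == 0 (every point of
                # the interval is a minimizer) or when the improving side is unbounded.
                if c[j] > 0 and lo > -np.inf:
                    x[j] = lo
                elif c[j] < 0 and hi < np.inf:
                    x[j] = hi
            if np.max(np.abs(x - x_old)) < tol:
                break  # coordinate-wise fixed point
        return x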

Bounds on Weighted CSPs Using Constraint Propagation and Super-Reparametrizations

  • Authors: Ing. Tomáš Dlask, Ph.D., doc. Ing. Tomáš Werner, Ph.D., de Givry, S.
  • Published in: 27th International Conference on Principles and Practice of Constraint Programming. Schloss Dagstuhl – Leibniz-Zentrum für Informatik, 2021. ISSN 1868-8969. ISBN 978-3-95977-211-2.
  • Year: 2021
  • DOI: 10.4230/LIPIcs.CP.2021.23
  • Link: https://doi.org/10.4230/LIPIcs.CP.2021.23
  • Department: Machine Learning
  • Abstract:
    We propose a framework for computing upper bounds on the optimal value of the (maximization version of) Weighted CSP (WCSP) using super-reparametrizations, which are changes of the weights that keep or increase the WCSP objective for every assignment. We show that it is in principle possible to employ arbitrary (under certain technical conditions) constraint propagation rules to improve the bound. For arc consistency in particular, the method reduces to the known Virtual AC (VAC) algorithm. As a new contribution, we implemented the method for singleton arc consistency (SAC) and compared it to other strong local consistencies in WCSPs on a public benchmark. The results show that the bounds obtained from SAC are superior for many instance groups.

A Class of Linear Programs Solvable by Coordinate-Wise Minimization

  • DOI: 10.1007/978-3-030-53552-0_8
  • Link: https://doi.org/10.1007/978-3-030-53552-0_8
  • Department: Machine Learning
  • Abstract:
    Coordinate-wise minimization is a simple and popular method for large-scale optimization. Unfortunately, for general (non-differentiable and/or constrained) convex problems, it may not find global minima. We present a class of linear programs that coordinate-wise minimization solves exactly. We show that dual LP relaxations of several well-known combinatorial optimization problems are in this class and that the method finds a global minimum with sufficient accuracy in reasonable runtimes. Moreover, for extensions of these problems that are no longer in this class, the method yields reasonably good suboptima. Though the presented LP relaxations can be solved by more efficient methods (such as max-flow), our results are theoretically non-trivial and can lead to new large-scale optimization algorithms in the future.

Bounding Linear Programs by Constraint Propagation: Application to Max-SAT

  • DOI: 10.1007/978-3-030-58475-7_11
  • Link: https://doi.org/10.1007/978-3-030-58475-7_11
  • Department: Machine Learning
  • Abstract:
    The Virtual Arc Consistency (VAC) algorithm by Cooper et al. is a soft local consistency technique that computes, in linear space, a bound on the basic LP relaxation of the Weighted CSP (WCSP). We generalize this technique by replacing arc consistency with a (problem-dependent) constraint propagation in a system of linear inequalities over the reals. When propagation detects infeasibility, the infeasibility certificate (a solution to the alternative system in Farkas’ lemma) provides a dual improving direction. We illustrate this approach on the LP relaxation of Weighted Max-SAT. We show in experiments that the obtained bounds are often not far from global LP optima and we prove that they are exact for known tractable subclasses of Weighted Max-SAT.
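  • Example (illustrative):
    A minimal Python sketch of the Farkas-lemma certificate mentioned in the abstract, assuming SciPy. A system A x <= b is infeasible iff there exists y >= 0 with A^T y = 0 and b^T y < 0; the normalization b^T y = -1 and the function name are illustrative choices, not the paper's propagation procedure.

    import numpy as np
    from scipy.optimize import linprog

    def farkas_certificate(A, b):
        """Return y >= 0 with A^T y = 0 and b^T y = -1 if A x <= b is infeasible, else None."""
        A, b = np.asarray(A, float), np.asarray(b, float)
        m, n = A.shape
        A_eq = np.vstack([A.T, b.reshape(1, -1)])   # constraints A^T y = 0 and b^T y = -1
        b_eq = np.concatenate([np.zeros(n), [-1.0]])
        res = linprog(c=np.zeros(m), A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * m)
        return res.x if res.success else None       # None means A x <= b is feasible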

On Coordinate-Wise Minimization Applied to General Convex Optimization Problems

  • Authors: Ing. Tomáš Dlask, Ph.D.
  • Published in: Knowledge-Based and Intelligent Information & Engineering Systems: Proceedings of the 24th International Conference KES2020. Amsterdam: Elsevier B.V., 2020. pp. 1328-1337. Procedia Computer Science, vol. 176. ISSN 1877-0509.
  • Year: 2020
  • DOI: 10.1016/j.procs.2020.09.142
  • Link: https://doi.org/10.1016/j.procs.2020.09.142
  • Department: Machine Learning
  • Abstract:
    In this paper, we theoretically analyze principal properties of coordinate-wise minimization and we answer a few open questions related to it. In particular, we show that for minimizing any differentiable convex function on a given convex polyhedron, there exists a finite set of directions determined only by the polyhedron such that any local minimum with respect to these directions is a global minimum. We prove that the set of directions has a ‘nice’ structure for polyhedra defined by a k-regular matrix, which is a generalization of total unimodularity. We proceed to show that for some simple polyhedra, the number of these directions is polynomial, which subsumes already existing algorithms. We prove that the main result cannot be extended to the case where the polyhedron is not known in advance or the optimization is performed over a general convex set.
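  • Example (illustrative):
    A hypothetical toy instance of the setting in the abstract: for the box [0,1]^n, whose constraint matrix is totally unimodular (and hence covered by the k-regular case), the coordinate directions already form such a finite set, so exact coordinate-wise minimization of a differentiable convex function reaches a global minimum. The quadratic objective and the code below are illustrative, not taken from the paper.

    import numpy as np

    def coordinate_descent_box(Q, p, sweeps=200):
        """Minimize 1/2 x^T Q x + p^T x over [0, 1]^n for symmetric positive definite Q."""
        n = len(p)
        x = np.zeros(n)
        for _ in range(sweeps):
            for j in range(n):
                # Exact minimizer over x_j with the other coordinates fixed, clipped to [0, 1].
                x[j] = np.clip(-(p[j] + Q[j] @ x - Q[j, j] * x[j]) / Q[j, j], 0.0, 1.0)
        return x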

On Relation Between Constraint Propagation and Block-Coordinate Descent in Linear Programs

  • DOI: 10.1007/978-3-030-58475-7_12
  • Link: https://doi.org/10.1007/978-3-030-58475-7_12
  • Department: Machine Learning
  • Abstract:
    Block-coordinate descent (BCD) is a popular method in large-scale optimization. Unfortunately, its fixed points need not be global optima even for convex problems. A succinct characterization of convex problems optimally solvable by BCD is unknown. Focusing on linear programs, we show that BCD fixed points are identical to fixed points of another method, which uses constraint propagation to detect infeasibility of a system of linear inequalities in a primal-dual loop (a special case of this method is the Virtual Arc Consistency algorithm by Cooper et al.). This implies that BCD fixed points are global optima iff a certain propagation rule decides feasibility of a certain class of systems of linear inequalities.

Relative Interior Rule in Block-Coordinate Descent

  • DOI: 10.1109/CVPR42600.2020.00758
  • Link: https://doi.org/10.1109/CVPR42600.2020.00758
  • Department: Machine Learning
  • Abstract:
    It is well-known that for general convex optimization problems, block-coordinate descent can get stuck in poor local optima. Despite that, versions of this method known as convergent message passing are very successful at approximately solving the dual LP relaxation of the MAP inference problem in graphical models. In an attempt to identify the reason why these methods often achieve good local minima, we argue that if in block-coordinate descent the set of minimizers over a variable block has multiple elements, one should choose an element from the relative interior of this set. We show that this rule is no worse than any other rule for choosing block-minimizers. Based on this observation, we develop a theoretical framework for block-coordinate descent applied to general convex problems. We illustrate this theory on convergent message-passing methods.
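  • Example (illustrative):
    A minimal sketch of the relative-interior rule for the simplest case where a block is a single variable and its set of block-minimizers is an interval [lo, hi]: instead of returning an arbitrary endpoint, the update picks a point from the relative interior of that set. The function and the tie-breaking constants are illustrative assumptions, not the paper's implementation.

    import math

    def relative_interior_update(lo, hi, current):
        """Pick the next value of the coordinate from the relative interior of [lo, hi]."""
        if lo == hi:                       # unique minimizer: the set is a single point
            return lo
        if lo < current < hi:              # current value already lies in the interior
            return current
        if math.isinf(lo) or math.isinf(hi):
            # Unbounded minimizer set: step a fixed distance inward from the finite endpoint.
            return hi - 1.0 if math.isinf(lo) else lo + 1.0
        return 0.5 * (lo + hi)             # bounded interval: take its midpoint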

Unit Propagation by Means of Coordinate-Wise Minimization

  • Authors: Ing. Tomáš Dlask, Ph.D.
  • Published in: Machine Learning, Optimization, and Data Science. Basel: Springer Nature Switzerland AG, 2020. pp. 688-699. LNCS, vol. 12565. ISSN 0302-9743. ISBN 978-3-030-64582-3.
  • Year: 2020
  • DOI: 10.1007/978-3-030-64583-0_60
  • Link: https://doi.org/10.1007/978-3-030-64583-0_60
  • Department: Machine Learning
  • Abstract:
    We present a novel theoretical result concerning the applicability of coordinate-wise minimization to the dual of the linear programming (LP) relaxation of weighted partial Max-SAT: every fixed point of this procedure defines a feasible primal solution. In addition, this primal solution corresponds to the result of a certain propagation rule applicable to weighted Max-SAT. Moreover, we analyze the particular case of the LP relaxation of SAT and observe that coordinate-wise minimization on the dual problem resembles unit propagation and also has the same time complexity as a naive unit propagation algorithm. We compare our theoretical results with max-sum diffusion, a coordinate-wise minimization algorithm that is used to optimize the dual of the LP relaxation of the Max-Sum problem and can in fact perform a different kind of constraint propagation, namely deciding whether a given constraint satisfaction problem (CSP) has a non-empty arc consistency closure.
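  • Example (illustrative):
    For reference, a minimal Python sketch of the naive unit propagation algorithm to which the abstract compares the coordinate-wise procedure; it is the standard textbook routine, not the paper's dual coordinate-wise minimization. Literals are nonzero integers: v means variable v is true and -v means it is false.

    def unit_propagation(clauses):
        """Assign literals forced by unit clauses; return (assignment, conflict_found)."""
        assignment = {}                              # variable -> bool
        changed = True
        while changed:
            changed = False
            for clause in clauses:
                unassigned, satisfied = [], False
                for lit in clause:
                    var, val = abs(lit), lit > 0
                    if var not in assignment:
                        unassigned.append(lit)
                    elif assignment[var] == val:
                        satisfied = True             # clause already satisfied
                        break
                if satisfied:
                    continue
                if not unassigned:                   # every literal falsified: conflict
                    return assignment, True
                if len(unassigned) == 1:             # unit clause: its literal is forced
                    lit = unassigned[0]
                    assignment[abs(lit)] = lit > 0
                    changed = True
        return assignment, False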
