Publications, papers, surveys…

Here I list my papers. Their copyrights belong to the publishers. All papers stored locally on this page are preprints, essentially equivalent to the published versions, and may be downloaded for personal or academic use only. Some of my papers are also indexed and stored at the following locations.

With a modern browser, any math on this page should render properly: if \(e^x\) shows up as the raw source $e^x$, then your browser is not able to render the math.

Journal

Circular (Yet Sound) Proofs in Propositional Logic (2023).
Albert Atserias and Massimo Lauria.
In: ACM Trans. Comput. Logic. 24(3):1--20.
Conference version published at MFCS, 2022.
Abstract

Proofs in propositional logic are typically presented as trees of derived formulas or, alternatively, as directed acyclic graphs of derived formulas. This distinction between tree-like vs. dag-like structure is particularly relevant when making quantitative considerations regarding, for example, proof size. Here we analyze a more general type of structural restriction for proofs in rule-based proof systems. In this definition, proofs are directed graphs of derived formulas in which cycles are allowed as long as every formula is derived at least as many times as it is required as a premise. We call such proofs “circular”. We show that, for all sets of standard inference rules with single or multiple conclusions, circular proofs are sound. We start the study of the proof complexity of circular proofs at Circular Resolution, the circular version of Resolution. We immediately see that Circular Resolution is stronger than Dag-like Resolution since, as we show, the propositional encoding of the pigeonhole principle has circular Resolution proofs of polynomial size. Furthermore, for derivations of clauses from clauses, we show that Circular Resolution is, surprisingly, equivalent to Sherali-Adams, a proof system for reasoning through polynomial inequalities that has linear programming at its base. As corollaries we get: 1) polynomial-time (LP-based) algorithms that find Circular Resolution proofs of constant width, 2) examples that separate Circular from Dag-like Resolution, such as the pigeonhole principle and its variants, and 3) exponentially hard cases for Circular Resolution. Contrary to the case of Circular Resolution, for Frege we show that circular proofs can be converted into tree-like proofs with at most polynomial overhead.
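
For reference, the propositional pigeonhole principle $\mathsf{PHP}^{m}_{n}$ mentioned here is the usual CNF encoding (standard notation, which may differ in minor details from the paper's): variables $p_{i,j}$ say that pigeon $i$ sits in hole $j$.
\[
\bigwedge_{i=1}^{m} \bigvee_{j=1}^{n} p_{i,j}
\;\wedge\;
\bigwedge_{j=1}^{n} \bigwedge_{1 \le i < i' \le m} \big( \neg p_{i,j} \vee \neg p_{i',j} \big),
\]
which is unsatisfiable whenever $m > n$; the abstract above states that Circular Resolution refutes it in polynomial size.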

On vanishing sums of roots of unity in polynomial calculus and sum-of-squares (2023).
Ilario Bonacina, Nicola Galesi and Massimo Lauria.
In: Computational Complexity. 32(2):12.
Abstract

We introduce a novel take on sum-of-squares that is able to reason with complex numbers and still make use of polynomial inequalities. This proof system might be of independent interest since it allows one to represent multivalued domains with both Boolean and Fourier encodings. We show degree and size lower bounds in this system for a natural generalization of knapsack: the vanishing sums of roots of unity. These lower bounds naturally apply to polynomial calculus as well.
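
As a rough illustration of the two encodings mentioned above (my notation, not necessarily the paper's), a variable ranging over a domain of size $k$ can be represented either by Boolean indicator variables or by a single variable constrained to the $k$-th roots of unity:
\[
\underbrace{x_1, \dots, x_k \in \{0,1\},\ \ \textstyle\sum_{j=1}^{k} x_j = 1}_{\text{Boolean encoding}}
\qquad\text{versus}\qquad
\underbrace{y^{k} = 1}_{\text{Fourier encoding}} .
\]
In the Fourier encoding $y$ takes one of the complex values $e^{2\pi i j/k}$ for $j = 0, \dots, k-1$, and a vanishing sum of roots of unity asks whether such variables can satisfy $y_1 + \dots + y_m = 0$, generalizing the knapsack instances in which every variable ranges over $\{\pm 1\}$, i.e. over the square roots of unity.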

Verification and generation of unrefinable partitions (2023).
Riccardo Aragona, Lorenzo Campioni, Roberto Civino and Massimo Lauria.
In: Information Processing Letters. 181(--):--.
Abstract

Unrefinable partitions are a subset of partitions into distinct parts which satisfy an additional unrefinability property. More precisely, no parts of such partitions can be written as the sum of different integers which are not parts. We address in this paper the algorithmic aspects related to unrefinable partitions, such as testing whether a given partition is unrefinable or not and enumerating all the partitions whose sum is a given number. We design two algorithms to solve the two mentioned problems and we discuss their complexity.
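
As a concrete illustration of the decision problem, here is a naive brute-force sketch of the definition (not one of the algorithms designed in the paper), following the two-summand formulation of refinability spelled out in the abstract of the Aequationes Mathematicae paper below:

    # Naive test of unrefinability (illustrative sketch, not the paper's algorithm):
    # a partition into distinct parts is refinable if some part a can be replaced by
    # two distinct integers that are not parts and whose sum is a.
    def is_unrefinable(parts):
        assert len(parts) == len(set(parts)), "parts must be distinct"
        part_set = set(parts)
        for a in parts:
            for x in range(1, (a + 1) // 2):   # x < a - x ensures the two summands differ
                y = a - x
                if x not in part_set and y not in part_set:
                    return False               # part a refines into x + y
        return True

    print(is_unrefinable([1, 2, 4]))   # True:  4 = 1+3 uses the part 1, and 2+2 is not distinct
    print(is_unrefinable([1, 2, 7]))   # False: 7 = 3 + 4 and neither 3 nor 4 is a part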

On the maximal part in unrefinable partitions of triangular numbers (2022).
Riccardo Aragona, Lorenzo Campioni, Roberto Civino and Massimo Lauria.
In: Aequationes Mathematicae. 96(6):1339--1363.
Abstract

A partition into distinct parts is refinable if one of its parts $a$ can be replaced by two different integers which do not belong to the partition and whose sum is $a$, and it is unrefinable otherwise. Clearly, the condition of being unrefinable imposes on the partition a non-trivial limitation on the size of the largest part and on the possible distributions of the parts. We prove a $O(n^{1/2})$-upper bound for the largest part in an unrefinable partition of $n$, and we call maximal those which reach the bound. We show a complete classification of maximal unrefinable partitions for triangular numbers, proving that if $n$ is even there exists only one maximal unrefinable partition of $n(n+1)/2$, and that if $n$ is odd the number of such partitions equals the number of partitions of $\lceil n/2 \rceil$ into distinct parts. In the second case, an explicit bijection is provided.

Clique Is Hard on Average for Regular Resolution (2021).
Albert Atserias, Ilario Bonacina, Susanna F. de Rezende, Massimo Lauria, Jakob Nordström and Alexander A. Razborov.
In: Journal of the ACM. 68(4):23:1--23:26.
Conference version published at STOC, 2018.
Abstract

We prove that for $k \ll \sqrt[4]{n}$ regular resolution requires length $n^{\Omega(k)}$ to establish that an Erdős-Rényi graph with appropriately chosen edge density does not contain a $k$-clique. This lower bound is optimal up to the multiplicative constant in the exponent, and also implies unconditional $n^{\Omega(k)}$ lower bounds on running time for several state-of-the-art algorithms for finding maximum cliques in graphs.

Upper bounds on positional Paris–Harrington games (2021).
Lorenzo Carlucci and Massimo Lauria.
In: Discrete Mathematics. 344(3):112257.
Abstract

We give upper bounds for a positional game (in the sense of Beck) based on the Paris-Harrington Theorem for bi-colorings of graphs and uniform hypergraphs of arbitrary dimension. The bounds for the positional game show a striking difference when compared to the bounds for the combinatorial principle itself. Our results confirm a phenomenon already observed by Beck and others: the upper bounds for the game version of a combinatorial principle are drastically smaller than the upper bounds for the principle itself. In the case of Paris-Harrington games the difference is qualitatively very striking. For example, the bounds for the game on $3$-uniform hypergraphs are a fixed stack of exponentials while the bounds for the corresponding combinatorial principle are known to be at least Ackermannian. For higher dimensions, the combinatorial Paris-Harrington numbers are known to be cofinal in the Schwichtenberg-Wainer hierarchy of fast-growing functions up to the ordinal $\varepsilon_0$, while we show that the game Paris-Harrington numbers are bounded by fixed stacks of exponentials.

A note about $k$-DNF resolution (2018).
Massimo Lauria.
In: Information Processing Letters. 137(TBD):33--39.
Abstract

In this note we show two results about $k$-DNF resolution. First we prove that resolution extended with parities of size $k$ is strictly weaker than $k$-DNF resolution. Then we show the connection between tree-like $k$-DNF resolution and narrow dag-like resolution. This latter result is clearly implicit in [Krajíček 1994], but this direct proof is focused on resolution and provides information about refutation width.

On semantic cutting planes proofs with very small coefficients (2018).
Massimo Lauria and Neil Thapen.
In: Information Processing Letters. 136(TBD):70--75.
Abstract

Cutting planes proofs for integer programs can naturally be defined both in a syntactic and in a semantic fashion. Filmus et al. (STACS 2016) proved that semantic cutting planes proofs may be exponentially stronger than syntactic ones, even if they use the semantic rule only once. We show that when semantic cutting planes proofs are restricted to have coefficients bounded by a function growing slowly enough, syntactic cutting planes can simulate them efficiently. Furthermore if we strengthen the restriction to a constant bound, then the simulating syntactic proof even has polynomially small coefficients.
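
As a small worked example (mine, not from the paper): over $\{0,1\}$ assignments the inequality $x \ge 1$ is semantically implied by $x + y \ge 1$ and $x - y \ge 0$, since the only satisfying assignments are $(1,0)$ and $(1,1)$. A syntactic cutting planes derivation reaches the same conclusion in two steps:
\[
\frac{x + y \ge 1 \qquad x - y \ge 0}{2x \ge 1}\ \text{(addition)}
\qquad\qquad
\frac{2x \ge 1}{x \ge \lceil 1/2 \rceil = 1}\ \text{(division by 2 with rounding)}.
\]
The semantic rule allows any inference that is valid over all $0/1$ assignments in a single step, whence the difference in power studied above.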

Cliques enumeration and tree-like resolution proofs (2018).
Massimo Lauria.
In: Information Processing Letters. 135(TBD):62--67.
Abstract

We show the close connection between the enumeration of cliques in a $k$-clique free graph $G$ and the length of tree-like resolution refutations for the formula $\mathrm{Clique}(G,k)$, which claims that $G$ has a $k$-clique. The length of any such tree-like refutation is within a "fixed parameter tractable" factor from the number of cliques in the graph. We then proceed to drastically simplify the proofs of the lower bounds for the length of tree-like resolution refutations of $\mathrm{Clique}(G,k)$ shown in [Beyersdorff et al. 2013, Lauria et al. 2017], which now reduce to a simple estimate of the number of cliques.
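
For concreteness, $\mathrm{Clique}(G,k)$ is typically the following block encoding (a common convention in this line of work; the paper's exact clauses may differ). Variables $x_{i,v}$, for $i \in [k]$ and $v \in V(G)$, state that vertex $v$ is chosen as the $i$-th member of the clique:
\[
\bigwedge_{i=1}^{k} \bigvee_{v \in V} x_{i,v}
\;\wedge\;
\bigwedge_{i \ne j}\ \bigwedge_{\{u,v\} \notin E} \big( \neg x_{i,u} \vee \neg x_{j,v} \big),
\]
where the second group of clauses ranges over all pairs $u, v$ not joined by an edge, including $u = v$. The formula is satisfiable exactly when $G$ contains a $k$-clique, so it is unsatisfiable for the $k$-clique free graphs considered above.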

Tight Size-Degree Bounds for Sums-of-Squares Proofs (2017).
Massimo Lauria and Jakob Nordström.
In: Computational Complexity. 26(4):911--948.
Conference version published at CCC, 2015.
Abstract

We exhibit families of $4$-CNF formulas over $n$ variables that have sums-of-squares (SOS) proofs of unsatisfiability of degree (a.k.a. rank) $d$ but require SOS proofs of size $n^{\Omega(d)}$ for values of $d = d(n)$ from constant all the way up to $n^{\delta}$ for some universal constant $\delta$. This shows that the $n^{O(d)}$ running time obtained by using the Lasserre semidefinite programming relaxations to find degree-$d$ SOS proofs is optimal up to constant factors in the exponent. We establish this result by combining $\mathsf{NP}$-reductions expressible as low-degree SOS derivations with the idea of relativizing CNF formulas in [Krajíček '04] and [Dantchev and Riis '03], and then applying a restriction argument as in [Atserias, Müller, and Oliva '13] and [Atserias, Lauria, and Nordström '14]. This yields a generic method of amplifying SOS degree lower bounds to size lower bounds, and also generalizes the approach in [ALN14] to obtain size lower bounds for the proof systems resolution, polynomial calculus, and Sherali-Adams from lower bounds on width, degree, and rank, respectively.
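
For orientation, here is the usual definition of an SOS refutation of a system of polynomial equations $f_1 = \dots = f_m = 0$ over Boolean variables, with the Boolean axioms $x_i^2 - x_i = 0$ included among the $f_i$ (the paper may use slightly different conventions). A degree-$d$ refutation is a polynomial identity
\[
\sum_{i=1}^{m} t_i f_i \;+\; \sum_{j} s_j^{2} \;=\; -1,
\]
where every product $t_i f_i$ and every square $s_j^{2}$ has degree at most $d$; the size of the refutation is usually measured as the total number of monomials in the $t_i$ and $s_j$.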

A Rank Lower Bound for Cutting Planes Proofs of Ramsey's Theorem (2016).
Massimo Lauria.
In: ACM Trans. Comput. Theory. 8(4):17:1--17:13.
Conference version published at SAT, 2013.
Abstract

Ramsey's Theorem is a cornerstone of combinatorics and logic. In its simplest formulation it says that for every $k > 0$ and $s > 0$, there is a minimum number $r(k, s)$ such that any simple graph with at least $r(k, s)$ vertices contains either a clique of size $k$ or an independent set of size $s$. We study the complexity of proving upper bounds for the number $r(k, k)$. In particular, we focus on the propositional proof system cutting planes; we show that any cutting plane proof of the upper bound “$r(k, k) \leq 4^k$” requires high rank. In order to do that we show a protection lemma which could be of independent interest.
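
The bound in question is the classical Erdős–Szekeres upper bound, recalled here for context (standard textbook material, not taken from the paper):
\[
r(k,s) \;\le\; r(k-1,s) + r(k,s-1),
\qquad\text{hence}\qquad
r(k,k) \;\le\; \binom{2k-2}{k-1} \;\le\; 4^{k}.
\]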

Narrow Proofs May Be Maximally Long (2016).
Albert Atserias, Massimo Lauria and Jakob Nordström.
In: ACM Trans. Comput. Logic. 17(3):19:1--19:30.
Conference version published at CCC, 2014.
Abstract
We prove that there are $3$-CNF formulas over $n$ variables refutable in resolution in width $w$ that require resolution proofs of size $n^{\Omega(w)}$. This shows that the simple counting argument that any formula refutable in width $w$ must have a proof in size $n^{O(w)}$ is essentially tight. Moreover, our lower bound extends even to polynomial calculus resolution (PCR) and Sherali-Adams, implying that the corresponding size upper bounds in terms of degree and rank are tight as well. In contrast, the formulas have Lasserre proofs of constant rank and size polynomial in both $n$ and $w$.
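
For context, the counting argument runs as follows (a standard observation, spelled out here): a dag-like refutation never needs to derive the same clause twice, so a refutation of width $w$ over $n$ variables can be assumed to contain only distinct clauses of width at most $w$, and hence has size at most
\[
\sum_{i=0}^{w} \binom{n}{i}\, 2^{i} \;\le\; (2n+1)^{w} \;=\; n^{O(w)}.
\]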

On the Proof Complexity of Paris-Harrington and Off-Diagonal Ramsey Tautologies (2016).
Lorenzo Carlucci, Nicola Galesi and Massimo Lauria.
In: ACM Trans. Comput. Logic. 17(4):26:1--26:25.
Part of this work appeared as a conference paper at CCC, 2011.
Abstract

We study the proof complexity of Paris-Harrington’s Large Ramsey Theorem for bi-colorings of graphs and of off-diagonal Ramsey’s Theorem. For Paris-Harrington we prove a non-trivial conditional lower bound in Resolution and a non-trivial upper bound in bounded-depth Frege. The lower bound is conditional on a (very reasonable) hardness assumption for a weak (quasi-polynomial) Pigeonhole principle in $\mathsf{Res}(2)$. We show that under such assumption, there is no refutation of the Paris-Harrington formulas of size quasi-polynomial in the number of propositional variables. The proof technique for the lower bound extends the idea of using a combinatorial principle to blow-up a counterexample for another combinatorial principle beyond the threshold of inconsistency. A strong link with the proof complexity of an unbalanced off-diagonal Ramsey principle is established. This is obtained by adapting some constructions due to Erdős and Mills. We prove a non-trivial Resolution lower bound for a family of such off-diagonal Ramsey principles.

The Complexity of Proving that a Graph is Ramsey (2017).
Massimo Lauria, Pavel Pudlák, Vojtěch Rödl and Neil Thapen.
In: Combinatorica. 37(2):253--268.
Conference version published at ICALP, 2013.
Abstract

We say that a graph with $n$ vertices is $c$-Ramsey if it does not contain either a clique or an independent set of size $c \log n$. We define a CNF formula which expresses this property for a graph $G$. We show a superpolynomial lower bound on the length of resolution proofs that $G$ is $c$-Ramsey, for every graph $G$. Our proof makes use of the fact that every Ramsey graph must contain a large subgraph with some of the statistical properties of the random graph.

Space Complexity in Polynomial Calculus (2015).
Yuval Filmus, Massimo Lauria, Jakob Nordström, Neil Thapen and Noga Ron-Zewi.
In: SIAM Journal on Computing. 44(4):1119--1153.
Conference version published at CCC, 2012.
Abstract

During the last decade, an active line of research in proof complexity has been to study space complexity and time-space trade-offs for proofs. Besides being a natural complexity measure of intrinsic interest, space is also an important issue in SAT solving, and so research has mostly focused on weak systems that are used by SAT solvers.

There has been a relatively long sequence of papers on space in resolution and resolution-based proof systems, and it is probably fair to say that resolution is reasonably well understood from this point of view. For other natural candidates to study, however, such as polynomial calculus or cutting planes, very little has been known. We are not aware of any nontrivial space lower bounds for cutting planes, and for polynomial calculus the only lower bound has been for CNF formulas of unbounded width in [Alekhnovich et al.'02], where the space lower bound is smaller than the initial width of the clauses in the formulas. Thus, in particular, it has been consistent with current knowledge that polynomial calculus could be able to refute any $k$-CNF formula in constant space.

In this paper, we prove several new results on space in polynomial calculus (PC), and in the extended proof system polynomial calculus resolution (PCR) studied in [Alekhnovich et al.'02]:

  1. We prove an $\Omega(n)$ space lower bound in PC for the canonical $3$-CNF version of the pigeonhole principle formulas $\mathsf{PHP}_n^m$ with $m$ pigeons and $n$ holes, and show that this is tight.
  2. For PCR, we prove an $\Omega(n)$ space lower bound for a bitwise encoding of the functional pigeonhole principle with $m$ pigeons and $n$ holes. These formulas have width $O(\log n)$, and so this is an exponential improvement over [Alekhnovich et al.'02] measured in the width of the formulas.
  3. We then present another encoding of a version of the pigeonhole principle that has constant width, and prove an $\Omega(n)$ space lower bound in PCR for these formulas as well.
  4. Finally, we prove that any $k$-CNF formula can be refuted in PC in simultaneous exponential size and linear space (which holds for resolution and thus for PCR, but was not obviously the case for PC). We also characterize a natural class of CNF formulas for which the space complexity in resolution and PCR does not change when the formula is transformed into a $3$-CNF in the canonical way, something that we believe can be useful when proving PCR space lower bounds for other well-studied formula families in proof complexity.

From Small Space to Small Width in Resolution (2015).
Yuval Filmus, Massimo Lauria, Mladen Miksa, Jakob Nordström and Marc Vinyals.
In: ACM Transactions on Computational Logic. 16(4):28:1--28:15.
Conference version published at STACS, 2014.
Abstract
In 2003, Atserias and Dalmau resolved a major open question about the resolution proof system by establishing that the space complexity of formulas is always an upper bound on the width needed to refute them. Their proof is beautiful but somewhat mysterious in that it relies heavily on tools from finite model theory. We give an alternative, completely elementary, proof that works by simple syntactic manipulations of resolution refutations. As a by-product, we develop a “black-box” technique for proving space lower bounds via a “static” complexity measure that works against any resolution refutation—previous techniques have been inherently adaptive. We conclude by showing that the related question for polynomial calculus (i.e., whether space is an upper bound on degree) seems unlikely to be resolvable by similar methods.

Parameterized Complexity of DPLL Search Procedures (2013).
Olaf Beyersdorff, Nicola Galesi and Massimo Lauria.
In: ACM Transactions on Computational Logic. 14(3):20.
Conference version appeared at SAT, 2011.
Abstract

We study the performance of DPLL algorithms on parameterized problems. In particular, we investigate how difficult it is to decide whether small solutions exist for satisfiability and combinatorial problems. For this purpose we develop a Prover-Delayer game which models the running time of DPLL procedures, and we establish an information-theoretic method to obtain lower bounds on the running time of parameterized DPLL procedures. We illustrate this technique by showing lower bounds for the parameterized pigeonhole principle and for the ordering principle. As our main application we study the DPLL procedure for the problem of deciding whether a graph has a small clique. We show that proving the absence of a $k$-clique requires $n^{\Omega(k)}$ steps for a non-trivial distribution of graphs close to the critical threshold. For the restricted case of tree-like Parameterized Resolution, this result answers a question, asked in Beyersdorff et al. (2012), about the Resolution complexity of this family of formulas.

A Characterization of Tree-Like Resolution Size (2013).
Olaf Beyersdorff, Nicola Galesi and Massimo Lauria.
In: Information Processing Letters. 113(18):666--671.
Abstract

We explain an asymmetric Prover-Delayer game which precisely characterizes proof size in tree-like Resolution. This game was previously described in a parameterized complexity context to show lower bounds for parameterized formulas and for the classical pigeonhole principle. The main point of this note is to show that the asymmetric game in fact characterizes tree-like Resolution proof size, i.e. in principle our proof method makes it possible to always achieve the optimal lower bounds. This is in contrast with previous techniques described in the literature. We also provide a very intuitive information-theoretic interpretation of the game.
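
Roughly, and with details possibly differing from the paper's formulation, the asymmetric game works as follows: in each round the Prover queries a variable; the Delayer either answers it or declares weights $p_0, p_1 \ge 0$ with $p_0 + p_1 = 1$ and lets the Prover pick the value $b \in \{0,1\}$; the game ends when the partial assignment falsifies a clause. The Delayer's score accumulates as
\[
\text{score} \;=\; \sum_{\text{deferred queries}} \log_2 \frac{1}{p_b},
\]
and the characterization states that the best score the Delayer can guarantee matches, up to rounding, the binary logarithm of the size of the smallest tree-like Resolution refutation.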

Parameterized Bounded-Depth Frege Is not Optimal (2012).
Olaf Beyersdorff, Nicola Galesi, Massimo Lauria and Alexander Razborov.
In: ACM Trans. Comput. Theory. 4(3):7:1--7:16.
Conference version appeared at ICALP, 2011
Selected for Computing Reviews' Notable Books and Articles 2012 list.
Abstract

A general framework for parameterized proof complexity was introduced by Dantchev, Martin, and Szeider [FOCS, 2007]. In that framework the parameterized version of any proof system is not fpt-bounded for some technical reasons, but we remark that this question becomes much more interesting if we restrict ourselves to those parameterized contradictions $(F,k)$ in which $F$ itself is a contradiction. We call such parameterized contradictions strong, and with one important exception (vertex cover) all interesting contradictions we are aware of are strong. It follows from the gap complexity theorem of Dantchev, Martin and Szeider [FOCS, 2007] that tree-like Parameterized Resolution is not fpt-bounded w.r.t. strong parameterized contradictions.

The main result of this paper significantly improves upon this by showing that even the parameterized version of bounded-depth Frege is not fpt-bounded w.r.t. strong contradictions. More precisely, we prove that the pigeonhole principle requires proofs of size $n^{\Omega(k)}$ in bounded-depth Frege, and, as a special case, in dag-like Parameterized Resolution. This answers an open question posed in Dantchev, Martin and Szeider [FOCS, 2007].

In the opposite direction, we interpret a well-known $\mathsf{FPT}$ algorithm for vertex cover as a DPLL procedure for Parameterized Resolution. Its generalization leads to a proof search algorithm for Parameterized Resolution that in particular shows that tree-like Parameterized Resolution allows short refutations of all parameterized contradictions given as bounded-width CNFs.
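
The well-known FPT algorithm alluded to here is presumably the classical bounded search tree for vertex cover: pick any uncovered edge and branch on which endpoint joins the cover, giving a tree of depth at most $k$ and hence $O(2^k \cdot \mathrm{poly})$ running time. A minimal sketch (mine, not the paper's DPLL formulation):

    # Bounded search tree for Vertex Cover (classical FPT algorithm, illustrative sketch).
    # Returns True iff the simple graph given as a collection of edges has a vertex cover
    # of size at most k.
    def has_vertex_cover(edges, k):
        edges = {frozenset(e) for e in edges}
        if not edges:
            return True            # nothing left to cover
        if k == 0:
            return False           # edges remain but no budget left
        u, v = tuple(next(iter(edges)))    # pick an arbitrary uncovered edge
        # Branch: either u or v must belong to any vertex cover.
        return (has_vertex_cover({e for e in edges if u not in e}, k - 1) or
                has_vertex_cover({e for e in edges if v not in e}, k - 1))

    # Example: a path on 4 vertices has a vertex cover of size 2 but not of size 1.
    assert has_vertex_cover([(1, 2), (2, 3), (3, 4)], 2)
    assert not has_vertex_cover([(1, 2), (2, 3), (3, 4)], 1)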

A Lower Bound for the Pigeonhole Principle in Tree-like Resolution by Asymmetric Prover-Delayer Games (2010).
Olaf Beyersdorff, Nicola Galesi and Massimo Lauria.
In: Information Processing Letters. 110(23):1074--1077.
Abstract
In this note we show that the asymmetric Prover-Delayer game developed in Beyersdorff, Galesi, Lauria [2010a] for Parameterized Resolution is also applicable to other tree-like proof systems. In particular, we use this asymmetric Prover-Delayer game to show a lower bound of the form $2^{\Omega(n\log n)}$ for the pigeonhole principle in tree-like Resolution. This gives a new and simpler proof of the same lower bound established by Iwama and Miyazaki [1999] and Dantchev and Riis [2001].

Optimality of size-degree tradeoffs for polynomial calculus (2010).
Nicola Galesi and Massimo Lauria.
In: ACM Transactions on Computational Logic. 12(1):4:1--4:22.
Abstract

There are methods to turn short refutations in Polynomial Calculus ($\mathsf{Pc}$) and Polynomial Calculus with Resolution ($\mathsf{Pcr}$) into refutations of low degree. Bonet and Galesi [Bonet and Galesi 1999, 2003] asked if such size-degree trade-offs for $\mathsf{Pc}$ [Clegg et al. 1996; Impagliazzo et al. 1999] and $\mathsf{Pcr}$ [Alekhnovich et al. 2004] are optimal.

We answer this question by showing a polynomial encoding of the Graph Ordering Principle on $m$ variables which requires $\mathsf{Pc}$ and $\mathsf{Pcr}$ refutations of degree $\Omega(\sqrt{m})$. The optimality of the trade-offs follows from our result and from the short refutations of the Graph Ordering Principle in [Bonet and Galesi 1999, 2001].

We then introduce the algebraic proof system $\mathsf{Pcr}_{k}$ which combines together Polynomial Calculus ($\mathsf{Pc}$) and $k$-DNF Resolution ($\mathsf{Res}_{k}$). We show a size hierarchy theorem for $\mathsf{Pcr}_{k}$: $\mathsf{Pcr}_{k}$ is exponentially separated from $\mathsf{Pcr}_{k+1}$. This follows from the previous degree lower bound and from techniques developed for $\mathsf{Res}_{k}$.

Finally we show that random formulas in conjunctive normal form ($3$-CNF) are hard to refute in $\mathsf{Pcr}_{k}$.

On the Automatizability of Polynomial Calculus (2010).
Nicola Galesi and Massimo Lauria.
In: Theory of Computing Systems. 47(2):491--506.
Errata: typos in the statement and proof of Lemma 4. Fixed only in the local version (updated May 10, 2022).
Abstract

We prove that Polynomial Calculus and Polynomial Calculus with Resolution are not automatizable, unless $W[P]$-hard problems are fixed parameter tractable by one-sided error randomized algorithms. This extends to Polynomial Calculus the analogous result obtained for Resolution by Alekhnovich and Razborov (SIAM J. Computing, 38(4), 2008).

Minimum-Energy Broadcast and disk cover in grid wireless networks (2008).
Tiziana Calamoneri, Andrea E. F. Clementi, Miriam Di Ianni, Massimo Lauria, Angelo Monti and Riccardo Silvestri.
In: Theoretical Computer Science. 399(1-2):38--53.
Conference version appeared at SIROCCO, 2006.
Abstract

The Minimum-Energy Broadcast problem is to assign a transmission range to every station of an ad hoc wireless network so that (i) a given source station is allowed to perform broadcast operations and (ii) the overall energy consumption of the range assignment is minimized.

We prove a nearly tight asymptotical bound on the optimal cost for the Minimum-Energy Broadcast problem on square grids. We also derive near-tight bounds for the Bounded-Hop version of this problem. Our results imply that the best-known heuristic, the MST-based one, for the Minimum-Energy Broadcast problem is far from achieving optimal solutions (even) on very regular, well-spread instances: its worst-case approximation ratio is about $\pi$ and it yields $\Omega(\sqrt{n})$ source hops, where $n$ is the number of stations.

As a by-product, we get nearly tight bounds for the Minimum-Disk Cover problem and for its restriction in which the allowed disks must have non-constant radius.

Finally, we emphasize that our upper bounds are obtained via polynomial time constructions.

On the bounded-hop MST problem on random Euclidean instances (2007).
Andrea E. F. Clementi, Miriam Di Ianni, Massimo Lauria, Angelo Monti, Gianluca Rossi and Riccardo Silvestri.
In: Theoretical Computer Science. 384(2-3):161--167.
Conference version appeared at SIROCCO 2005 with the name "Divide and Conquer Is Almost Optimal for the Bounded-Hop MST Problem on Random Euclidean Instances".
Abstract

The $d$-Dim $h$-hops MST problem is defined as follows: Given a set $S$ of points in the $d$-dimensional Euclidean space and $s \in S$, find a minimum-cost spanning tree for $S$ rooted at $s$ with height at most $h$. We investigate the problem for any constants $h$ and $d > 0$. We prove the first non-trivial lower bound on the solution cost for almost all Euclidean instances (i.e. the lower bound holds with high probability). Then we introduce an easy-to-implement, very fast divide et impera heuristic and we prove that its solution cost matches the lower bound.

Conference

MaxSAT Resolution with Inclusion Redundancy (2024).
Ilario Bonacina, Maria Luisa Bonet and Massimo Lauria.
In: 27th International Conference on Theory and Applications of Satisfiability Testing (SAT 2024). pp. 7:1--7:15.
Abstract

Popular redundancy rules for SAT are not necessarily sound for MaxSAT. The works of [Bonacina-Bonet-Buss-Lauria'24] and [Ihalainen-Berg-Järvisalo'22] proposed ways to adapt them, but required specific encodings and more sophisticated checks during proof verification. Here, we propose a different way to adapt redundancy rules from SAT to MaxSAT. Our rules do not require specific encodings, their correctness is simpler to check, but they are slightly less expressive. However, the proposed redundancy rules, when added to MaxSAT-Resolution, are already strong enough to capture Branch-and-bound algorithms, enable short proofs of the optimal cost of notable principles (e.g., the Pigeonhole Principle and the Parity Principle), and make it possible to break simple symmetries (e.g., XOR-ification does not make formulas harder).

The Power of Negative Reasoning (2021).
Susanna F. de Rezende, Massimo Lauria, Jakob Nordström and Dmitry Sokolov.
In: 36th Computational Complexity Conference (CCC 2021). pp. 40:1--40:24.
Abstract

Semialgebraic proof systems have been studied extensively in proof complexity since the late 1990s to understand the power of Gröbner basis computations, linear and semidefinite programming hierarchies, and other methods. Such proof systems are defined alternately with only the original variables of the problem and with special formal variables for positive and negative literals, but there seems to have been no study of how these different definitions affect the power of the proof systems. We show for Nullstellensatz, polynomial calculus, Sherali-Adams, and sums-of-squares that adding formal variables for negative literals makes the proof systems exponentially stronger, with respect to the number of terms in the proofs. These separations are witnessed by CNF formulas that are easy for resolution, which establishes that polynomial calculus, Sherali-Adams, and sums-of-squares cannot efficiently simulate resolution without having access to variables for negative literals.
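
Concretely, the formal variables for negative literals are the "twin variables" used in systems such as polynomial calculus resolution: alongside each variable $x$ one introduces a companion $\bar{x}$ together with axioms forcing complementarity (standard convention, recalled from memory rather than quoted from the paper):
\[
x^2 - x = 0, \qquad \bar{x}^2 - \bar{x} = 0, \qquad x + \bar{x} - 1 = 0.
\]
With these variables a clause $x_1 \vee \dots \vee x_w$ becomes the single monomial equation $\bar{x}_1 \cdots \bar{x}_w = 0$, whereas in the original variables it expands to $(1-x_1)\cdots(1-x_w) = 0$ with $2^w$ terms, which is why the number of terms in proofs can change so dramatically between the two conventions.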

Algorithm Analysis Through Proof Complexity (2018).
Massimo Lauria.
In: 14th Conference on Computability in Europe (CiE 2018). pp. 254--263.

Graph Colouring is Hard for Algorithms Based on Hilbert's Nullstellensatz and Gröbner Bases (2017).
Massimo Lauria and Jakob Nordström.
In: 32nd Computational Complexity Conference (CCC 2017). pp. 2:1--2:20.
Abstract

We consider the natural encoding of the $k$-colouring problem for a given graph as a set of polynomial equations. We prove that there are bounded-degree graphs that do not have legal $k$-colourings but for which the polynomial calculus proof system defined in [Clegg et al '96, Alekhnovich et al '02] requires linear degree, and hence exponential size, to establish this fact. This implies a linear degree lower bound for any algorithms based on Gröbner bases solving $k$-colouring problems using this encoding. The same bound applies also for the algorithm studied in a sequence of papers [De Loera et al '08, '09, '11, '15] based on Hilbert's Nullstellensatz proofs for a slightly different encoding, thus resolving an open problem mentioned, e.g., in [De Loera et al '09] and [Li et al '16].
We obtain our results by combining the polynomial calculus degree lower bound for functional pigeonhole principle (FPHP) formulas over bounded-degree bipartite graphs in [Mikša and Nordström '15] with a reduction from FPHP to $k$-colouring derivable by polynomial calculus in constant degree.
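
The "natural encoding" in question is usually the following system over the $k$-th roots of unity (I state the commonly used version; as the abstract notes, the paper and the De Loera et al. line of work use slightly different variants):
\[
x_v^{k} - 1 = 0 \quad \text{for every vertex } v,
\qquad
\sum_{i=0}^{k-1} x_u^{\,i}\, x_v^{\,k-1-i} = 0 \quad \text{for every edge } \{u,v\},
\]
which is satisfiable over $\mathbb{C}$ exactly when the graph is $k$-colourable: each vertex receives a $k$-th root of unity as its colour, and the edge equation vanishes precisely when the two endpoints receive different roots.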

CNFgen: A generator of crafted benchmarks (2017).
Massimo Lauria, Jan Elffers, Jakob Nordström and Marc Vinyals.
In: Theory and Applications of Satisfiability Testing - SAT 2017. pp. 464--473.
Abstract

We present CNFgen, a generator of combinatorial benchmarks in DIMACS and OPB format. The proof complexity literature is a rich source not only of hard instances but also of instances that are theoretically easy but “extremal” in different ways, and therefore of potential interest in the context of SAT solving. Since most of these formulas appear not to be very well known in the SAT community, however, we propose CNFgen as a resource to make them readily available for solver development and evaluation. Many formulas studied in proof complexity are based on graphs, and CNFgen is also able to generate, parse and do basic manipulation of such objects. Furthermore, it includes a library cnfformula giving access to the functionality of CNFgen to Python programs.
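
As a toy illustration of the kind of benchmark involved (this is not CNFgen's own code or API, just a self-contained sketch), here is how the pigeonhole principle formula $\mathsf{PHP}^{m}_{n}$ can be written out in DIMACS format:

    # Write the pigeonhole principle PHP (m pigeons, n holes) in DIMACS CNF format.
    # Illustrative sketch only; CNFgen itself generates this and many other families.
    def php_dimacs(m, n):
        var = lambda i, j: i * n + j + 1          # variable "pigeon i sits in hole j"
        clauses = []
        for i in range(m):                        # every pigeon sits somewhere
            clauses.append([var(i, j) for j in range(n)])
        for j in range(n):                        # no two pigeons share a hole
            for i1 in range(m):
                for i2 in range(i1 + 1, m):
                    clauses.append([-var(i1, j), -var(i2, j)])
        lines = ["p cnf {} {}".format(m * n, len(clauses))]
        lines += [" ".join(map(str, c)) + " 0" for c in clauses]
        return "\n".join(lines)

    print(php_dimacs(3, 2))   # unsatisfiable instance: 3 pigeons, 2 holes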

Trade-offs Between Time and Memory in a Tighter Model of CDCL SAT Solvers (2016).
Jan Elffers, Jan Johannsen, Massimo Lauria, Thomas Magnard, Jakob Nordström and Marc Vinyals.
In: Theory and Applications of Satisfiability Testing - SAT 2016 - 19th International Conference, Bordeaux, France, July 5-8, 2016, Proceedings. pp. 160--176.
Abstract

A long line of research has studied the power of conflict-driven clause learning (CDCL) and how it compares to the resolution proof system in which it searches for proofs. It has been shown that CDCL can polynomially simulate resolution, even when the learning scheme is adversarially chosen as long as it is asserting. However, the simulation only works under the assumption that no learned clauses are ever forgotten, and the polynomial blow-up is significant. Moreover, the simulation requires very frequent restarts, whereas the power of CDCL with less frequent or entirely without restarts remains poorly understood. With a view towards obtaining results with tighter relations between CDCL and resolution, we introduce a more fine-grained model of CDCL that captures not only time but also memory usage and number of restarts. We show how previously established strong size-space trade-offs for resolution can be transformed into equally strong trade-offs between time and memory usage for CDCL, where the upper bounds hold for CDCL without any restarts using the standard 1UIP clause learning scheme, and the (in some cases tightly matching) lower bounds hold for arbitrarily frequent restarts and arbitrary clause learning schemes.

Semantic Versus Syntactic Cutting Planes (2016).
Yuval Filmus, Pavel Hrubes and Massimo Lauria.
In: 33rd Symposium on Theoretical Aspects of Computer Science (STACS 2016). pp. 35:1--35:13.
Abstract

In this paper, we compare the strength of the semantic and syntactic version of the cutting planes proof system.
First, we show that the lower bound technique of Pudlák applies also to semantic cutting planes: the proof system has feasible interpolation via monotone real circuits, which gives an exponential lower bound on lengths of semantic cutting planes refutations.
Second, we show that semantic refutations are stronger than syntactic ones. In particular, we give a formula for which any refutation in syntactic cutting planes requires exponential length, while there is a polynomial length refutation in semantic cutting planes. In other words, syntactic cutting planes does not p-simulate semantic cutting planes. We also give two incompatible integer inequalities which require exponential length refutation in syntactic cutting planes.
Finally, we pose the following problem, which arises in connection with semantic inference of arity larger than two: can every multivariate non-decreasing real function be expressed as a composition of non-decreasing real functions in two variables?

Hardness of Approximation in PSPACE and Separation Results for Pebble Games (2015).
Siu Man Chan, Massimo Lauria, Jakob Nordström and Marc Vinyals.
In: Proc. of the 56th IEEE Symposium on Foundations of Computer Science (FOCS), 2015. pp. 466--485.
Abstract

We consider the pebble game on DAGs with bounded fan-in introduced in [Paterson and Hewitt '70] and the reversible version of this game in [Bennett '89], and study the question of how hard it is to decide exactly or approximately the number of pebbles needed for a given DAG in these games.

We prove that the problem of deciding whether $s$ pebbles suffice to reversibly pebble a DAG $G$ is PSPACE-complete, as was previously shown for the standard pebble game in [Gilbert, Lengauer and Tarjan '80]. Via two different graph product constructions we then strengthen these results to establish that both standard and reversible pebbling space are PSPACE-hard to approximate to within any additive constant. To the best of our knowledge, these are the first hardness of approximation results for pebble games in an unrestricted setting (even for polynomial time). Also, since [Chan '13] proved that reversible pebbling is equivalent to the games in [Dymond and Tompa '85] and [Raz and McKenzie '99], our results apply to the Dymond--Tompa and Raz--McKenzie games as well, and from the same paper it follows that resolution depth is PSPACE-hard to determine up to any additive constant.

We also obtain a multiplicative logarithmic separation between reversible and standard pebbling space. This improves on the additive logarithmic separation previously known and could plausibly be tight, although we are not able to prove this.

We leave as an interesting open problem whether our additive hardness of approximation result could be strengthened to a multiplicative bound if the computational resources are decreased from polynomial space to the more common setting of polynomial time.
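
For readers unfamiliar with the two games, here is a small sketch (standard rules as I recall them; the paper's definitions may differ in details) that checks the legality of a sequence of pebbling moves on a DAG and reports how many pebbles were used:

    # Check a sequence of ("add"/"remove", vertex) moves against the pebbling rules and
    # return the maximum number of pebbles on the DAG at any time (illustrative sketch).
    def pebbling_cost(dag, moves, reversible=False):
        # dag maps each vertex to the list of its immediate predecessors.
        pebbled, cost = set(), 0
        for op, v in moves:
            preds_ok = all(u in pebbled for u in dag[v])
            if op == "add":
                assert preds_ok, "cannot place a pebble on " + str(v)
                pebbled.add(v)
            else:  # "remove": unrestricted in the standard game, restricted in the reversible one
                assert (not reversible) or preds_ok, "cannot reversibly remove " + str(v)
                pebbled.discard(v)
            cost = max(cost, len(pebbled))
        return cost

    # Pebbling the path a -> b -> c with two pebbles (legal in both games).
    dag = {"a": [], "b": ["a"], "c": ["b"]}
    moves = [("add", "a"), ("add", "b"), ("remove", "a"), ("add", "c")]
    print(pebbling_cost(dag, moves, reversible=True))   # prints 2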

Towards an Understanding of Polynomial Calculus: New Separations and Lower Bounds (Extended Abstract) (2013).
Yuval Filmus, Massimo Lauria, Mladen Mikša, Jakob Nordström and Marc Vinyals.
In: Proc. of the 40th International Colloquium on Automata, Languages and Programming, ICALP 2013. pp. 437-448.
Abstract

During the last decade, an active line of research in proof complexity has been into the space complexity of proofs and how space is related to other measures. By now these aspects of resolution are fairly well understood, but many open problems remain for the related but stronger polynomial calculus (PC/PCR) proof system. For instance, the space complexity of many standard “benchmark formulas” is still open, as well as the relation of space to size and degree in PC/PCR.
We prove that if a formula requires large resolution width, then making XOR substitution yields a formula requiring large PCR space, providing some circumstantial evidence that degree might be a lower bound for space. More importantly, this immediately yields formulas that are very hard for space but very easy for size, exhibiting a size-space separation similar to what is known for resolution. Using related ideas, we show that if a graph has good expansion and in addition its edge set can be partitioned into short cycles, then the Tseitin formula over this graph requires large PCR space. In particular, Tseitin formulas over random $4$-regular graphs almost surely require space at least $\Omega(\sqrt{n})$.
Our proofs use techniques recently introduced in [Bonacina-Galesi '13]. Our final contribution, however, is to show that these techniques provably cannot yield non-constant space lower bounds for the functional pigeonhole principle, delineating the limitations of this framework and suggesting that we are still far from characterizing PC/PCR space.
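
Here XOR substitution means replacing each variable $x$ by the exclusive or $x_1 \oplus x_2$ of two fresh variables and expanding back into clauses (the standard transformation; the paper may use a slight variant). For example (my own illustration), substituting into the clause $x \vee \neg y$ yields
\[
(x_1 \oplus x_2) \vee \neg(y_1 \oplus y_2)
\;=\;
(x_1 \vee x_2 \vee \neg y_1 \vee y_2) \wedge
(x_1 \vee x_2 \vee y_1 \vee \neg y_2) \wedge
(\neg x_1 \vee \neg x_2 \vee \neg y_1 \vee y_2) \wedge
(\neg x_1 \vee \neg x_2 \vee y_1 \vee \neg y_2).
\]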

A Distributed Protocol for the Bounded-Hops Converge-Cast in Ad-Hoc Networks (2006).
Andrea E. F. Clementi, Miriam Di Ianni, Massimo Lauria, Angelo Monti, Gianluca Rossi and Riccardo Silvestri.
In: Ad-Hoc, Mobile, and Wireless Networks, 5th International Conference, ADHOC-NOW 2006, Ottawa, Canada, August 17-19, 2006, Proceedings. pp. 60--72.
Abstract

Given a set $S$ of points (stations) located in the $d$-dim. Euclidean space and a root $b \in S$, the $h$-hops $\mathsf{Convergecast}$ problem asks for a minimal energy-cost range assignment which allows one to perform the converge-cast primitive (i.e. node accumulation) towards $b$ in at most $h$ hops. For this problem no polynomial time algorithm is known even for $h = 2$.

The main goal of this work is the design of an efficient distributed heuristic (i.e. protocol) and the analysis (both theoretical and experimental) of its expected solution cost. In particular, we introduce an efficient parameterized randomized protocol for $h$-hops $\mathsf{Convergecast}$ and we analyze it on random instances created by placing $n$ points uniformly at random in a $d$-cube of side length $L$. We prove that for $h = 2$ its expected approximation ratio is bounded by some constant factor. Finally, for $h = 3, \ldots, 8$, we provide a wide experimental study showing that our protocol has very good performance when compared with previously introduced (centralized) heuristics.

Thesis

Ph.D. thesis
Degree lower bounds for Algebraic Proof System (2008).
Massimo Lauria.
Sapienza Università di Roma.
Abstract

Given a propositional formula $\phi$ we are interested in studying how difficult it is to prove that $\phi$ is a tautology. Multivariate polynomials can be manipulated in a specific way to obtain such a proof. We study the power of this algebraic manipulation. In particular its running time is deeply related to the degree of the polynomials involved. We prove that this method behaves very badly in some cases, by showing that there are tautologies with very short proofs but which require high degree polynomials.

This does not exclude that another method of polynomial manipulation could lead to an efficient algorithm. In this case we are able to prove that if such an efficient algorithm exists, it could be used to solve problems believed to be very hard.

The algebraic proof systems discussed so far are limited. We define a slightly stronger system $\mathsf{Pcr}_{k}$. We give examples of tautologies that are easy for this system, but very hard for systems known in the literature. Nevertheless we show that a random tautology is still very hard to prove for $\mathsf{Pcr}_{k}$.