Subset sum problem with many divisibility conditions

Let S be a set of natural numbers. We consider S under the divisibility partial order, i.e. s_1\le s_2 \Leftrightarrow s_1 | s_2. Let

\alpha(S)=\max\{|A| : A\subseteq S \text{ is an antichain}\}.

If we consider the subset sum problem where the multiset of numbers is drawn from S, what can we say about the complexity of the problem in terms of \alpha(S)? It is simple to see that if \alpha(S)=1 (i.e. S is a chain under divisibility), then the problem is easy. Note it is easy even for the harder knapsack problem when \alpha(S)=1\dagger.


\dagger Solving sequential knapsack problems by M. Hartmann and T. Olmstead (1993)
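For intuition on why \alpha(S)=1 makes subset sum easy, here is a minimal sketch, assuming the multiset forms a single divisibility chain. It illustrates the easy case only; it is not the Hartmann–Olmstead algorithm itself, which handles the more general knapsack setting.

```python
def chain_subset_sum(items, target):
    """Decide subset sum when the multiset `items` is a chain under
    divisibility (alpha(S) = 1): any two sizes divide one another.

    Largest-first greedy: take each item if it still fits. For
    divisible sizes an exchange argument shows the greedy reaches the
    maximum subset sum <= target, so it hits `target` exactly
    whenever some subset does. O(n log n) time.
    """
    total = 0
    for w in sorted(items, reverse=True):
        if total + w <= target:
            total += w
    return total == target

# 1 | 2 | 4 | 8: a chain under divisibility.
assert chain_subset_sum([8, 4, 2, 2, 1], 15)    # 8+4+2+1 = 15
assert not chain_subset_sum([8, 4, 4], 10)      # achievable sums: 0,4,8,12,16
```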

Answer

Computing \alpha(S) can be done in polynomial time using linear programming, and this is actually true for any partial order (S,\le). Indeed, one can prove by induction that for any finite partially ordered set (S,\le), there exists a finite set S'\subseteq\mathbb{N} and a bijection f:S\rightarrow S', such that for all s_1,s_2\in S, s_1\le s_2 \Leftrightarrow f(s_1) | f(s_2); so divisibility orders are fully general.
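The induction can also be replaced by an explicit construction: attach a distinct prime p_s to every s\in S and map s to the product of the primes of its down-set \{t : t\le s\}. A sketch of that construction (the helper names are mine):

```python
from itertools import count
from math import prod

def first_primes(n):
    """Return the first n primes by trial division (fine at this scale)."""
    primes = []
    for m in count(2):
        if all(m % p for p in primes):
            primes.append(m)
            if len(primes) == n:
                return primes

def divisibility_embedding(elements, leq):
    """Embed the finite poset (elements, leq) into (N, |):
    f(s) = product of the primes attached to the down-set of s,
    so s1 <= s2  iff  f(s1) divides f(s2)."""
    p = dict(zip(elements, first_primes(len(elements))))
    return {s: prod(p[t] for t in elements if leq(t, s)) for s in elements}

# Example: subsets of {a, b} ordered by inclusion.
f = divisibility_embedding(
    [frozenset(), frozenset("a"), frozenset("b"), frozenset("ab")],
    lambda x, y: x <= y)
# f: {} -> 2, {a} -> 6, {b} -> 10, {a,b} -> 210;
# e.g. 6 | 210 and 10 | 210, but 6 does not divide 10.
```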

Let \mathcal{C} be the set formed by the chains in S. Recall that C is a chain iff for all v,v'\in C, v\le v' or v'\le v.

Now create a boolean variable x_v for each v\in S, and a boolean variable y_C for each chain C\in\mathcal{C}. We can write the following integer program (P) for the maximum antichain problem:

\begin{split}
\text{Max } &\displaystyle\sum\limits_{v\in S} x_v \\
\text{subject to } &\displaystyle\sum\limits_{v\in C} x_v \le 1, \quad \forall C\in\mathcal{C}\\
&x_v \in \{0,1\}, \quad \forall v\in S
\end{split}
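As a toy sanity check of (P), here is a brute-force computation on a small instance (exponential in |S|, for illustration only; the function name is mine):

```python
from itertools import combinations

def max_antichain(S):
    """Brute-force Opt(P): a largest subset of S whose elements are
    pairwise incomparable under divisibility."""
    for r in range(len(S), 0, -1):
        for A in combinations(S, r):
            if all(a % b != 0 and b % a != 0 for a, b in combinations(A, 2)):
                return set(A)
    return set()

print(max_antichain([2, 3, 4, 9, 12]))  # {2, 3}: neither divides the other
```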

The dual (D) of (P) is:


\begin{split}
\text{Min } &\displaystyle\sum\limits_{C\in \mathcal{C}} y_C\\
\text{subject to } &\displaystyle\sum\limits_{C : v\in C} y_C \ge 1, \quad \forall v\in S\\
&y_C \in \{0,1\}, \quad \forall C\in \mathcal{C}
\end{split}

Then the problem of finding a minimum cover of an ordered set by chains is the dual of our problem. Dilworth's theorem states that

There exists an antichain A, and a partition of the order into a family P of chains, such that the number of chains in the partition equals the cardinality of A.

which means that the optimal values of these two problems match: Opt(P)=Opt(D).
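Dilworth's theorem also underlies the standard combinatorial route to the dual side: by the classical reduction (via König's theorem), a minimum chain cover of an n-element poset has size n minus a maximum matching in the bipartite graph with an edge (u,v) whenever u strictly divides v. A sketch using simple augmenting-path matching (names are mine):

```python
def min_chain_cover_size(S):
    """Size of a minimum cover of S by divisibility chains:
    n - (maximum matching in the bipartite comparability graph)."""
    succ = {u: [v for v in S if v != u and v % u == 0] for u in S}
    match = {}  # right vertex -> left vertex currently matched to it

    def augment(u, seen):
        # Try to match u, possibly re-matching earlier vertices.
        for v in succ[u]:
            if v not in seen:
                seen.add(v)
                if v not in match or augment(match[v], seen):
                    match[v] = u
                    return True
        return False

    return len(S) - sum(augment(u, set()) for u in S)
```

On S = \{2,3,4,9,12\} this returns 2, realized by the chains \{2,4,12\} and \{3,9\}, matching the antichain \{2,3\} found above.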

Let (P^*) (resp. (D^*)) be the relaxation of (P) (resp. (D)), i.e. the same program where each constraint x_v\in\{0,1\} (resp. y_C\in\{0,1\}) is replaced by x_v\in [0,1] (resp. y_C\in [0,1]). Let Opt(P^*) and Opt(D^*) be their optimal values. Since \{0,1\}\subseteq [0,1], we have:

Opt(P)\le Opt(P^*) \text{ and } Opt(D^*)\le Opt(D)

and the weak duality theorem establishes that Opt(P^*)\le Opt(D^*). Combining this with Dilworth's equality Opt(P)=Opt(D), everything collapses:

Opt(P)\le Opt(P^*)\le Opt(D^*)\le Opt(D)=Opt(P)

so that

Opt(P)= Opt(P^*)=Opt(D^*)=Opt(D)

In particular, the relaxation (P^*) has no integrality gap.
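A quick numerical check of Opt(P^*)=Opt(P) on the toy instance from above (assuming scipy is available; at this size we can afford to enumerate every chain explicitly instead of using a separation oracle):

```python
from itertools import combinations
from scipy.optimize import linprog

S = [2, 3, 4, 9, 12]
# All chains: subsets whose elements are pairwise comparable.
chains = [A for r in range(1, len(S) + 1)
          for A in combinations(S, r)
          if all(a % b == 0 or b % a == 0 for a, b in combinations(A, 2))]
# (P*): max sum x_v  s.t.  sum_{v in C} x_v <= 1,  0 <= x_v <= 1.
res = linprog(c=[-1] * len(S),
              A_ub=[[1 if v in C else 0 for v in S] for C in chains],
              b_ub=[1] * len(chains),
              bounds=[(0, 1)] * len(S))
print(-res.fun)  # 2.0 = alpha(S): the relaxation is tight here.
```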

Then, using the ellipsoid method, we can compute Opt(P^*) (=Opt(P)) in polynomial time. There is an exponential number of constraints, but there exists a polynomial-time separation oracle. For a 0/1 point x it suffices to enumerate all pairs s_1,s_2 of selected elements and check whether s_1\le s_2 or s_2\le s_1; if so, the constraint associated with the chain \{s_1,s_2\} is violated. For a fractional point x, find a chain C maximizing \sum_{v\in C} x_v by dynamic programming over the divisibility DAG (a maximum-weight path computation, polynomial in |S|): x is feasible iff this maximum is at most 1, and otherwise C gives a violated constraint.
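A sketch of that oracle (the function name and tolerance are mine; chains are exactly paths in the divisibility DAG because divisibility is transitive):

```python
def separation_oracle(S, x):
    """Separation for (P*): given fractional weights x[v], find a chain
    maximizing sum of x[v] by dynamic programming, processing elements
    in increasing order (a valid topological order for divisibility).
    Returns a violated chain, or None if x is feasible."""
    order = sorted(S)
    best = {}  # best[v] = (weight of heaviest chain ending at v, that chain)
    for v in order:
        w, chain = x[v], [v]
        for u in order:
            if u >= v:
                break
            if v % u == 0 and best[u][0] + x[v] > w:
                w, chain = best[u][0] + x[v], best[u][1] + [v]
        best[v] = (w, chain)
    w, chain = max(best.values(), key=lambda t: t[0])
    return chain if w > 1 + 1e-9 else None
```

If the oracle returns a chain C, then \sum_{v\in C} x_v \le 1 is the violated cut the ellipsoid method needs; if it returns None, x is feasible.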

Attribution
Source: Link, Question Author: Chao Xu, Answer Author: Mathieu Mari
