Given two ring families $\mathcal{C}$ and $\mathcal{D}$ on a finite ground set $V$, each containing both the empty set and $V$, consider the family of so-called intersections $\mathcal{L} = \{L \subseteq V \mid L = C \cap D,\ C \in \mathcal{C},\ D \in \mathcal{D},\ C \cup D = V\}$, and let $A$ be the incidence matrix of $\mathcal{L}$. The minimum partitioning problem — "given a vector $d \in \mathbb{Z}_+^V$, minimize $y\mathbf{1}$ subject to $yA = d$, $y \geq 0$, $y$ integer" — is solved by a longest path computation. The approach is polyhedral and builds on previous results on lattice matrices.
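To make the objects concrete, the following is a minimal sketch of how the intersection family $\mathcal{L}$ and its incidence matrix $A$ could be enumerated for small ground sets. The toy families `Cfam` and `Dfam` are hypothetical examples chosen only for illustration (each is a chain, hence closed under union and intersection, and contains both the empty set and $V$); this does not implement the paper's longest-path algorithm.

```python
from itertools import product

def intersection_family(Cfam, Dfam, V):
    """All L = C ∩ D with C in Cfam, D in Dfam, and C ∪ D = V."""
    return {frozenset(C & D)
            for C, D in product(Cfam, Dfam)
            if C | D == V}

# Hypothetical toy ring families on V = {1, 2, 3}, for illustration only.
V = frozenset({1, 2, 3})
Cfam = [frozenset(s) for s in [set(), {1}, {1, 2}, V]]
Dfam = [frozenset(s) for s in [set(), {3}, {2, 3}, V]]

L = intersection_family(Cfam, Dfam, V)

# Incidence matrix A: one 0/1 row per member of L, one column per element of V.
order = sorted(V)
A = [[int(v in Lset) for v in order] for Lset in sorted(L, key=sorted)]
```

With these toy families, $\mathcal{L}$ contains, besides $\emptyset$ and $V$, the singletons and the two-element intervals picked out by complementary pairs, e.g. $\{2\} = \{1,2\} \cap \{2,3\}$.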