Polynomially improved efficiency for fast parallel single-source lexicographic depth-first search, breadth-first search, and topological-first search

Citation
P. De La Torre and C.P. Kruskal, Polynomially improved efficiency for fast parallel single-source lexicographic depth-first search, breadth-first search, and topological-first search, THEOR C SYS, 34(4), 2001, pp. 275-298
Number of citations
35
Subject categories
Computer Science & Engineering
Journal title
THEORY OF COMPUTING SYSTEMS
ISSN journal
1432-4350
Volume
34
Issue
4
Year of publication
2001
Pages
275 - 298
Database
ISI
SICI code
1432-4350(200107/08)34:4<275:PIEFFP>2.0.ZU;2-H
Abstract
Although lexicographic (lex) variants of greedy algorithms are often P-complete, NC-algorithms are known for the following lex-search problems: lexicographic depth-first search (lex-dfs) for dags [12], [17], and lexicographic breadth-first search (lex-bfs) and lexicographic topological-first search (lex-tfs) for dags [12]. For the all-sources version of the problem on dense digraphs, the lex-dfs (lex-bfs, lex-tfs) algorithm in [12] is (within a log factor) work-optimal with respect to the all-sources sequential solution that performs a dfs (bfs, tfs) from every vertex. By contrast, to solve the single-source lexicographic version on inputs of size n, all known NC-algorithms perform work that is at least a factor of n away from the work performed by their sequential counterparts.

We present parallel algorithms that solve the single-source version of these lex-search problems in O(log^2 n) time using M(n) processors on the EREW PRAM. (M(n) denotes the number of processors required to multiply two n x n integer matrices in O(log n) time and has O(n^2.376) as the tightest currently known bound.) They all offer a polynomial improvement in work efficiency over the corresponding best previously known algorithms and close the gap between the requirements of the best known parallel algorithms for the lex and nonlex versions of the problems.

Key to the efficiency of these algorithms is the novel idea of a lex-splitting tree and lex-conquer subgraphs of a dag G from source s. These structures provide a divide-and-conquer skeleton from which NC-algorithms for several lexicographic search problems emerge, in particular an algorithm that places the lex-dfs problem for reducible flow graphs in the class NC; reducible flow graphs are an interesting class of graphs which arises naturally in connection with code optimization and data-flow analysis [4], [19]. A notable aspect of these algorithms is that they solve the lex-search problem instance at hand by efficiently transforming solutions of appropriate instances of (nonlex) path problems. This renders them potentially capable of transferring significant algorithmic advances, such as Driscoll et al.'s [14] single-source shortest paths algorithm and Ullman and Yannakakis' [34] transitive closure algorithm, from fundamental (nonlex) path problems to lex-search problems.
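For readers unfamiliar with the lex variants, the Python sketch below is a minimal illustration of what single-source lex-dfs asks for: the DFS tree obtained by always descending into the first (lowest-ordered) unvisited neighbor of the current vertex. It is a plain sequential reference for the problem definition only, not the paper's EREW PRAM algorithm; the function name lex_dfs and the example graph are invented for illustration.

    # Sequential sketch of single-source lexicographic depth-first search.
    # From the source, always follow the first unvisited neighbor in the
    # fixed adjacency order; the resulting parent links form the
    # lexicographically first DFS tree. This is the problem the paper
    # parallelizes, not its O(log^2 n)-time, M(n)-processor solution.

    def lex_dfs(adj, source):
        """Return the parent map of the lex-first DFS tree rooted at source.

        adj: dict mapping each vertex to its neighbors, listed in the
             fixed (lexicographic) order that defines the lex variant.
        """
        parent = {source: None}
        stack = [(source, iter(adj.get(source, ())))]
        while stack:
            v, it = stack[-1]
            for w in it:
                if w not in parent:          # first unvisited neighbor of v
                    parent[w] = v
                    stack.append((w, iter(adj.get(w, ()))))
                    break
            else:
                stack.pop()                  # all neighbors of v explored
        return parent

    # Hypothetical example dag; the neighbor lists encode the lexicographic order.
    example = {0: [1, 2], 1: [3], 2: [3], 3: []}
    print(lex_dfs(example, 0))  # {0: None, 1: 0, 3: 1, 2: 0}

Note that vertex 3 is reached through vertex 1 rather than vertex 2, because the lex rule forces the search down the branch through 1 first; a non-lex parallel DFS would be free to return either tree, which is why the lex variant is the harder problem to place in NC.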