Efficient Pre-Processing of Hard Problems: New Approaches, Basic Theory and Applications

Partner Investigator:
Michael A. Langston, Department of Computer Science, University of Tennessee
Abstract:
The main goal of this project is to develop parameterized complexity more fully, in both practical and theoretical directions. Specifically, we seek to:
(1) Establish improved kernelization bounds by means of powerful data-reduction rules for a number of combinatorial problems of central importance. These outcomes will be based on new techniques that build on recent advances by the project investigators.
(2) Devise new algorithmic strategies for polynomial-time kernelization. As an example, we have recently established a poly(k) kernelization for the undirected feedback vertex set problem by the novel strategy of employing a constant-factor approximation algorithm to compute an initial structure, relative to which effective reduction rules can then be defined and applied (an illustrative sketch of such reduction rules appears after this list).
(3) Develop theoretical machinery to support the mechanised discovery and verification of data-reduction rules. Our approach is closely related to the Myhill-Nerode methods for computing obstruction sets in the context of well-quasi-ordered sets, as pioneered by Fellows and Langston.
(4) Explore parameterized problem classes that are brought into focus by polynomial-time preprocessing, and develop new techniques for addressing membership issues for these classes. A starting point for this task is the min cut linear arrangement problem.
(5) Investigate systematic ways of deploying parameter-specific extremal structure theory, and its associated polynomial-time reduction rules, laterally across related algorithmic problems.
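
To illustrate the kind of polynomial-time preprocessing referred to in goal (2), the following is a minimal sketch of the classic local reduction rules for feedback vertex set, written in Python using the networkx library (an assumption made for illustration only). It does not reproduce the investigators' poly(k) kernel, which additionally uses a constant-factor approximate solution to organise further, global reduction rules; it only shows the elementary degree-based rules that such a kernelization builds upon.

import networkx as nx


def fvs_preprocess(G):
    """Classic local reduction rules for feedback vertex set preprocessing.

    Illustrative sketch only, not the investigators' kernelization.
    Returns (G', forced): a reduced multigraph G' and the set of vertices
    that every feedback vertex set of the original graph must contain.
    """
    G = nx.MultiGraph(G)          # work on a copy; parallel edges matter here
    forced = set()                # vertices forced into every solution
    changed = True
    while changed:
        changed = False
        for v in list(G.nodes):
            if v not in G:
                continue
            if G.number_of_edges(v, v) > 0:
                # A self-loop is a cycle through v alone: v must be taken.
                forced.add(v)
                G.remove_node(v)
                changed = True
            elif G.degree(v) <= 1:
                # Rule 1: a vertex of degree at most 1 lies on no cycle.
                G.remove_node(v)
                changed = True
            elif G.degree(v) == 2:
                # Rule 2: bypass a degree-2 vertex by joining its neighbours
                # (a parallel pair collapses to a self-loop, caught above).
                u, w = [x for _, x in G.edges(v)]
                G.remove_node(v)
                G.add_edge(u, w)
                changed = True
    return G, forced

For example, fvs_preprocess(nx.cycle_graph(5)) reduces a 5-cycle to the empty graph with a single forced vertex, reflecting the fact that simple cycles are completely resolved by these two rules alone.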
Participants:
The Chief Investigator for this project is Vladimir Estivill-Castro at Griffith University, Queensland, Australia.
The other Partner Investigator is Michael R. Fellows at The University of Newcastle, New South Wales, Australia.