Algorithm design studies the methods and techniques used to develop efficient algorithms. Designing an efficient algorithm is often a critical first step in solving problems in many areas. Depending on the model of computation or computer platform required for an application, one might draw on algorithmic research in specific subareas. Several CSE faculty members are involved in algorithmic research.
- Graph Algorithms (He, Ngo)
Research on graph algorithms has been a driving force in the design and analysis of algorithms. Many
practical problems can be modeled as graph problems, and recent developments in computer science
(such as the Internet and networking) continue to raise new problems and research opportunities.
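As a small illustration (my own, not from the source), breadth-first search is one of the most basic graph algorithms; here a network is modeled as a graph and fewest-hop distances are computed from one node:

```python
from collections import deque

def bfs_hops(adj, src):
    """Fewest-hop distance from src to every reachable node."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj.get(u, []):
            if v not in dist:  # first visit gives the shortest hop count
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

# Hypothetical network: routers connected by links.
net = {"a": ["b", "c"], "b": ["a", "d"], "c": ["a", "d"], "d": ["b", "c"]}
print(bfs_hops(net, "a"))  # {'a': 0, 'b': 1, 'c': 1, 'd': 2}
```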
- Parallel Algorithms and Architectures (Miller)
Since the computational power of a single processor is limited, large-scale
problems require multiprocessor machines. The study of parallel algorithms
and architectures is concerned with the design and implementation of
efficient techniques and strategies to solve problems on machines in
which multiple processors are connected.
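A minimal sketch of one underlying idea (illustrative only, not a faculty result): a tree reduction combines disjoint pairs in each round, so with one processor per pair, n values can be summed in about log2(n) rounds rather than n - 1 sequential steps.

```python
def tree_reduce_sum(values):
    """Simulate a parallel tree reduction: each round combines disjoint
    pairs, which independent processors could handle simultaneously."""
    rounds = 0
    while len(values) > 1:
        # Every pairwise sum below could be done by a separate processor.
        values = [sum(values[i:i + 2]) for i in range(0, len(values), 2)]
        rounds += 1
    return values[0], rounds

total, rounds = tree_reduce_sum(list(range(16)))
print(total, rounds)  # 120 4, i.e. 16 values summed in 4 parallel rounds
```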
- Graph Drawing (He)
Graph drawing deals with the problem of constructing two- and three-dimensional visualizations of graphs. A good
visualization of a graph has several important properties, such as few edge-crossings, small area, good aspect-ratio,
and few edge-bends. It has applications in practical areas such as Computer Graphics, VLSI design, and Computer
Animation. This field focuses on developing efficient algorithms for constructing good visualizations of graphs,
using techniques from Graph Algorithms and Computational Geometry.
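For instance, the edge-crossing count of a straight-line drawing can be computed with the standard orientation test from computational geometry (a sketch with made-up coordinates, not code from this group):

```python
def orient(p, q, r):
    # Sign of the cross product (q - p) x (r - p).
    v = (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])
    return (v > 0) - (v < 0)

def segments_cross(a, b, c, d):
    # Proper crossing: each segment's endpoints lie strictly on
    # opposite sides of the other segment's supporting line.
    o1, o2 = orient(a, b, c), orient(a, b, d)
    o3, o4 = orient(c, d, a), orient(c, d, b)
    return o1 != o2 and o3 != o4 and 0 not in (o1, o2, o3, o4)

def crossing_count(pos, edges):
    """Count pairwise crossings in a straight-line drawing.
    pos maps vertex to (x, y); edges is a list of vertex pairs.
    Edges sharing an endpoint are never counted as crossing."""
    n = 0
    for i in range(len(edges)):
        for j in range(i + 1, len(edges)):
            (u, v), (w, x) = edges[i], edges[j]
            if {u, v} & {w, x}:
                continue
            if segments_cross(pos[u], pos[v], pos[w], pos[x]):
                n += 1
    return n

# K4 drawn with one vertex inside the triangle: a planar drawing, 0 crossings.
pos = {1: (0, 0), 2: (4, 0), 3: (2, 3), 4: (2, 1)}
edges = [(1, 2), (1, 3), (2, 3), (1, 4), (2, 4), (3, 4)]
print(crossing_count(pos, edges))  # 0
```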
- Computational Geometry (Miller, Xu)
Computational geometry is a discipline concerned with the design, analysis, and implementation of efficient algorithms
for solving problems best described in terms of geometric objects such as points, curves, surfaces, and polyhedra. Since
many real-world problems can be modeled as geometric objects in certain metric spaces (such as Euclidean spaces),
this area has strong connections with many applied areas such as Medicine, Biology, Networking, Graphics, Robotics,
and VLSI design. Exploiting the geometric and combinatorial properties of these problems often leads to better quality
and more efficient solutions.
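A textbook example from this area (a standard algorithm, not work specific to these researchers) is the convex hull of a point set, computable in O(n log n) time with Andrew's monotone chain:

```python
def convex_hull(points):
    """Andrew's monotone-chain convex hull, O(n log n) after sorting."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:                      # build lower hull left to right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):            # build upper hull right to left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]     # counter-clockwise hull

print(convex_hull([(0, 0), (2, 0), (2, 2), (0, 2), (1, 1)]))
# [(0, 0), (2, 0), (2, 2), (0, 2)]; the interior point (1, 1) is discarded
```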
- Group Testing Algorithms (Ngo)
Group testing dates back to World War II. Algorithms are designed to identify
a set of "positives" in a large population of items, assuming
some mechanism exists that indicates whether a pool of items contains at least one positive.
Direct applications of group testing algorithms include DNA-library screening,
multiple access control, and mining association rules. Fundamental problems
in group testing relate to mathematical disciplines such as combinatorics, extremal
set theory, and algebra, making research on group testing very fruitful.
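The flavor of the area can be seen in binary splitting, the simplest group-testing strategy: when one of n items is positive, repeatedly testing half of the remaining candidates finds it with about log2(n) pooled tests (a toy sketch with a simulated test, not from the source):

```python
def find_positive(items, pool_is_positive):
    """Binary splitting: assumes the item list contains at least one positive.
    Each round tests one pooled half and discards the half that cannot
    contain the sought positive."""
    while len(items) > 1:
        half = items[:len(items) // 2]
        items = half if pool_is_positive(half) else items[len(items) // 2:]
    return items[0]

# Simulated screen: 16 samples, only sample 11 is positive.
tests_used = []
def mix_and_test(pool):
    tests_used.append(list(pool))   # record each pooled test
    return 11 in pool

print(find_positive(list(range(16)), mix_and_test))  # 11
print(len(tests_used))  # 4 pooled tests instead of 16 individual ones
```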
Complexity theory is a mathematical discipline that classifies computational
problems by relative difficulty and measures the computational resources
needed to solve them. It explains why certain problems have no practical
solutions and helps researchers anticipate the difficulties involved in
solving problems of certain types. The classification is quantitative:
it investigates both the resources necessary to solve a problem, called
lower bounds for the problem, and the resources currently known to be
sufficient, called upper bounds. For example, comparison-based sorting has an
Ω(n log n) lower bound on comparisons, matched by mergesort's O(n log n)
upper bound. In general, complexity theory deals
with the quantitative laws of computation and reasoning. For this reason,
complexity theory concerns issues and problems of direct interest to many
other disciplines as well.
Alan Selman's research is
concerned with properties of complexity classes, with relationships between
classes, and with identification of properties of problems that affect
their computational complexity. He has focused on several areas:
- Average case complexity, which provides mechanisms for classification
of computational problems according to average use of resources, rather
than worst-case usage
- Complexity-theoretic underpinnings of public-key cryptography,
including research on promise problems and the complexity of classes of partial functions
- Polynomial-time-bounded reducibilities
- P-selective sets, which are an important tool for studying reducibilities
and function classes
- Self-reducibility, which tends to lower complexity
Ken Regan's research focuses
on the obstacles to proving non-trivial lower bounds in complexity theory.
Motivated by the fact that virtually no super-linear, let alone super-polynomial,
time lower bounds are known for practical problems, part of Regan's
work has developed the less-attended theory of linear-time classes. Regan
has obtained super-linear lower bounds on time or circuit-size for some
problems under certain restrictions. Currently, Regan is pursuing a mathematical
approach to breaking barriers to proving super-polynomial lower bounds.
Regan's work includes:
- Linear time computation and super-linear lower bounds
- Polynomial ideals and algebraic geometry as tools for lower bounds
- Mathematical logic for analyzing provability and characterizing complexity classes
- Complexity-bounded measure theory
- Fixed-parameter complexity theory