One main reason traditional algorithms theory is inadequate for many modern applications is that it views computation as a simple process of transforming given input data into a desired output on a simple, well-defined machine model consisting of a processor and an (infinitely large) memory. This scenario is unrealistic in modern applications, where computation is increasingly performed on massive amounts of data (larger than the main memory) and on very diverse computing devices.
The center focuses on three distinct but closely related core research areas that address some of the inadequacies of traditional theory:
I/O-efficient algorithms
For efficient processing of massive datasets that reside on slow (external) mass storage devices.
Cache-oblivious algorithms
For efficient processing of large datasets on devices with complicated (possibly even unknown) memory hierarchies.
Streaming algorithms
For efficient processing of data that is so massive that reading through it more than once is infeasible, or of data that naturally arrives continually as a stream (a single-pass example is sketched after this list).
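To make the single-pass constraint concrete, the following is a minimal sketch of reservoir sampling, a standard textbook streaming technique (not one of the center's own algorithms): it maintains a uniform random sample of k items while reading the stream exactly once and using memory independent of the stream length.

    import random

    def reservoir_sample(stream, k):
        """Maintain a uniform random sample of k items from a stream,
        reading each item exactly once and using O(k) memory."""
        reservoir = []
        for i, item in enumerate(stream):
            if i < k:
                reservoir.append(item)      # fill the reservoir with the first k items
            else:
                j = random.randint(0, i)    # item i survives with probability k/(i+1)
                if j < k:
                    reservoir[j] = item
        return reservoir

    # Example: sample 5 items from a stream far too large to keep in main memory.
    print(reservoir_sample(range(10**6), 5))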
The center also focuses on a fourth and more practical core area:
Algorithm engineering
The design, analysis, and implementation of, as well as experimentation with, practical algorithms.
The first three areas are all relatively young research areas within the broader algorithms field, formed in response to some of the inadequacies of traditional theory. The fourth is a natural part of the center, both because a main motivation for the center is the inability of traditional algorithms theory to provide practically efficient algorithms, and because engineering work naturally supports multidisciplinary and industry collaboration.
Apart from the core research areas, center researchers also work on a number of additional algorithmic topics related to massive data processing, such as algorithms for flash memory and for modern multi-core processors (the parallel private-cache model), and algorithms that can handle (unknown) failures of some memory locations (the fault-tolerant model). Efficient data structures, including very space-efficient (succinct) data structures, are also being considered. In general, additional focus is on designing simple theoretical models that capture the essential features of modern computing devices.
The center's annual reports outline research progress in the above areas.