Etienne Grandjean (GREYC, Caen)

Accurately measuring the complexity of algorithms and establishing the intrinsic computational complexity of problems are among the most important goals in computer science. For this, everyone agrees that the right model of computation for sequential algorithms is the random access machine (RAM). Paradoxically, despite this general agreement, there is no consensus on how to define the RAM model: it is a Tower of Babel. Since the 1970s, the literature has presented a plethora of definitions of RAMs and their complexity, with no unifying result.

In this talk, I will present a simple RAM model. I will show that this model, on the one hand, faithfully reflects algorithms, their data structures, and their fine-grained complexity (linear time, or even constant time, etc.) and, on the other hand, is robust to many variations. In particular, we demonstrate that the complexity classes — linear time, quadratic time, enumeration complexity classes, etc. — do not change depending on whether the only primitive operation of a RAM is addition, or whether the usual arithmetic operations — subtraction, multiplication, or even division (!) — and many others are allowed. This is joint work with Louis Jachiet.
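As a toy illustration of why allowing richer arithmetic need not change such complexity classes (this is my own sketch, not the construction from the talk), multiplication can be simulated by a logarithmic number of additions via shift-and-add; the function name `mul_via_add` is hypothetical:

```python
def mul_via_add(a: int, b: int) -> int:
    """Multiply nonnegative integers using additions only.

    The product itself is built purely from additions (doubling x
    is x + x). The halving and parity tests on b are loop
    bookkeeping, which an addition-only RAM with comparisons can
    also carry out; here we use Python's // and % for brevity.
    """
    result = 0
    addend = a
    while b > 0:
        if b % 2 == 1:            # current binary digit of b is 1
            result = result + addend
        addend = addend + addend  # double the addend by self-addition
        b = b // 2                # move to the next binary digit
    return result

# Each multiplication costs O(log b) additions, so simulating it
# changes running times by at most a polylogarithmic factor.
print(mul_via_add(123, 456))  # → 56088
```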