Finding a c-optimal design of a regression model is a basic optimization problem in statistics. We study the computational complexity of the problem in the case of a finite experimental domain.
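For concreteness, a standard formulation of c-optimality over a finite domain (stated here only as a sketch in the usual notation of optimal design theory, not necessarily the notation fixed later in the text) is
\[
\min_{w \in \mathbb{R}^n}\; c^{\mathsf T} M(w)^{-} c
\quad\text{subject to}\quad
M(w)=\sum_{i=1}^{n} w_i\, f(x_i) f(x_i)^{\mathsf T},\quad \sum_{i=1}^{n} w_i = 1,\quad w \ge 0,\quad c \in \operatorname{range}\, M(w),
\]
where $x_1,\dots,x_n$ are the points of the finite experimental domain, $f$ is the vector of regression functions, and $M(w)^{-}$ denotes a generalized inverse of the moment matrix.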
We formulate a decision version of the problem and prove its NP-completeness. We provide examples of computationally hard instances of the design problem, motivated by cryptography.
Since the problem is NP-complete, we then relax it; we prove that a decision version of the relaxation, called approximate c-optimality, is P-complete. We derive an equivalence theorem for linear programming: we show that relaxed c-optimality is equivalent (in the sense of many-one LOGSPACE reducibility) to general linear programming.