An optimization problem is specified by its variables, an objective function, and constraints. The goal is to assign values to the variables so that the objective function is optimized and all constraints are satisfied.
At Data Assessment Solutions we model many tasks as optimization problems. Consider, for example, a simple project planning task, where the variable x(i,j,k) represents the number of hours that consultant i spends on project j on day k. A constraint could be that the sum of these variables over all projects j, for consultant i on day k, must not exceed ten hours. A possible objective function is to maximize the billable hours of all consultants in a given time period.
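The project planning example can be written down as a small linear program. The sketch below uses `scipy.optimize.linprog`; the problem sizes and the ten-hour cap are illustrative assumptions, not data from an actual planning task.

```python
# Minimal sketch of the project planning LP: maximize total billable
# hours, subject to a ten-hour daily cap per consultant.
# Problem dimensions are illustrative assumptions.
import numpy as np
from scipy.optimize import linprog

n_consultants, n_projects, n_days = 2, 2, 1
n_vars = n_consultants * n_projects * n_days  # x(i, j, k), flattened

def idx(i, j, k):
    # position of x(i, j, k) in the flat variable vector
    return (i * n_projects + j) * n_days + k

# Objective: maximize sum of all x, i.e. minimize -sum(x).
c = -np.ones(n_vars)

# Constraints: for each consultant i and day k,
# the sum over all projects j must not exceed 10 hours.
A_ub, b_ub = [], []
for i in range(n_consultants):
    for k in range(n_days):
        row = np.zeros(n_vars)
        for j in range(n_projects):
            row[idx(i, j, k)] = 1.0
        A_ub.append(row)
        b_ub.append(10.0)

res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, bounds=(0, None))
print(-res.fun)  # total billable hours: 2 consultants * 10 hours = 20
```

Real planning tasks add many more constraints of this kind (availability, project budgets, staffing levels), which is what makes algorithms for problems with many constraints relevant.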
Sören Laue and Joachim Giesen from DAS Research have developed an algorithm for solving optimization problems with many constraints on distributed hardware, that is, on a compute cluster or on the many cores of modern CPUs. At first glance, such a distributed algorithm seems to require three nested loops: an inner loop for solving optimization problems without constraints, a middle loop for satisfying the constraints, and an outer loop that runs on a master node to reach consensus among the solutions computed on the different nodes. Sören Laue and Joachim Giesen were able to show that two of these loops can be merged, so that two nested loops suffice, which leads to a significant speedup of the distributed algorithm.
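To illustrate the loop structure the text refers to, here is a hedged sketch of the classic augmented Lagrangian method on a toy problem, not the merged algorithm from the paper: an outer loop updates a Lagrange multiplier for the constraint, and an inner loop solves an unconstrained problem by gradient descent. The example problem and all step sizes are assumptions chosen for illustration.

```python
# Classic augmented Lagrangian loop structure (illustrative sketch,
# not the algorithm from the IJCAI 2019 paper).
# Toy problem: minimize (x - 3)^2  subject to  x <= 1  (optimum at x = 1).

def f_grad(x):
    return 2.0 * (x - 3.0)

def g(x):          # inequality constraint, written as g(x) <= 0
    return x - 1.0

rho, lam, x = 10.0, 0.0, 0.0
for _ in range(50):                   # outer loop: multiplier updates
    for _ in range(200):              # inner loop: unconstrained solve
        # gradient of the augmented Lagrangian with respect to x
        grad = f_grad(x) + rho * max(0.0, g(x) + lam / rho)
        x -= 0.01 * grad
    lam = max(0.0, lam + rho * g(x))  # multiplier update

print(round(x, 3))  # converges to the constrained optimum x = 1
```

In a distributed setting, a third, outermost consensus loop would coordinate such solves across compute nodes; the contribution of the paper is showing that two of the three loops can be combined.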
Publication: J. Giesen and S. Laue. Combining ADMM and the Augmented Lagrangian Method for Efficiently Handling Many Constraints. Proceedings of the 28th International Joint Conference on Artificial Intelligence (IJCAI), 2019.