- Zur globalen robusten Optimierung (On global robust optimization)
Beckers, Markus; Naumann, Uwe (Thesis advisor)
Aachen : Publikationsserver der RWTH Aachen University (2014)
Dissertation / PhD Thesis
Aachen, Techn. Hochsch., Diss., 2014
Abstract
Optimization problems in engineering often have nonconvex objectives and constraints and therefore call for global optimization algorithms. Moreover, the computed global optimum should be insensitive to variations of the influencing parameters, i.e. it has to be robust. We tackle this problem by combining three scientific areas: nonsmooth analysis of McCormick relaxations, uncertainty quantification, and algorithmic differentiation (AD). McCormick relaxations are convex underestimators and concave overestimators of the objective function and the constraints. Since convex functions may be nonsmooth, subgradients are used in place of ordinary derivatives; they are natural extensions of derivatives that provide similar information for nondifferentiable functions. AD-like methods can propagate the values of McCormick relaxations together with their subgradients. A first implementation of the tangent-linear mode of AD was developed by colleagues at MIT's Process Systems Engineering Laboratory. This work extends that approach with an adjoint mode, which offers the same advantages as usual adjoints. The subgradients, computed in either forward or reverse mode, are used within a deterministic branch-and-bound algorithm for global optimization. Furthermore, the equivalence of these methods to classical ones is proven.
Uncertainty quantification aims to determine the imprecision in the outputs of numerical programs caused by (measurement) errors in the inputs. For a known error distribution of the inputs, probabilistic methods yield information about the distribution of the outputs. Our approach is based on a Taylor series expansion of the function implemented by the simulation, yielding approximations of the mean and variance of the output distribution. AD allows the efficient computation of higher-order derivatives and therewith more precise approximations.
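The forward propagation of McCormick relaxation values together with their subgradients can be sketched in a few lines. The class below is a minimal one-variable illustration (not the thesis's actual implementation), assuming the standard McCormick rules for addition, scalar multiplication, and squaring; all names are illustrative.

```python
from dataclasses import dataclass

def mid3(a, b, c):
    """Return the median of three values and its index (0, 1, or 2)."""
    return sorted([(a, 0), (b, 1), (c, 2)])[1]

@dataclass
class MC:
    """One-dimensional McCormick object: interval bounds, convex/concave
    relaxation values at a point, and subgradients of both relaxations."""
    lo: float; hi: float      # interval enclosure
    cv: float; cc: float      # convex under- and concave overestimate
    dcv: float; dcc: float    # subgradients of cv and cc

    @staticmethod
    def var(x, lo, hi):
        # independent variable: both relaxations equal the value, slope 1
        return MC(lo, hi, x, x, 1.0, 1.0)

    def __add__(self, other):
        return MC(self.lo + other.lo, self.hi + other.hi,
                  self.cv + other.cv, self.cc + other.cc,
                  self.dcv + other.dcv, self.dcc + other.dcc)

    def scale(self, a):
        if a >= 0:
            return MC(a*self.lo, a*self.hi, a*self.cv, a*self.cc,
                      a*self.dcv, a*self.dcc)
        # a negative scalar swaps the roles of the two relaxations
        return MC(a*self.hi, a*self.lo, a*self.cc, a*self.cv,
                  a*self.dcc, a*self.dcv)

    def sq(self):
        L, U = self.lo, self.hi
        # convex envelope of z^2 is z^2; evaluate at mid(cv, cc, argmin z^2)
        zmin = min(max(0.0, L), U)
        m, i = mid3(self.cv, self.cc, zmin)
        dm = (self.dcv, self.dcc, 0.0)[i]
        cv, dcv = m*m, 2.0*m*dm
        # concave envelope is the secant (L+U)z - LU; its maximizer is an endpoint
        zmax = U if L + U >= 0 else L
        m, i = mid3(self.cv, self.cc, zmax)
        dm = (self.dcv, self.dcc, 0.0)[i]
        cc, dcc = (L + U)*m - L*U, (L + U)*dm
        lo = 0.0 if L <= 0.0 <= U else min(L*L, U*U)
        return MC(lo, max(L*L, U*U), cv, cc, dcv, dcc)

# relaxations of f(x) = x^2 - 2x on [-1, 2], evaluated at x = 0.5
x = MC.var(0.5, -1.0, 2.0)
f = x.sq() + x.scale(-2.0)
print(f.cv, f.cc)    # -0.75 1.5, bracketing f(0.5) = -0.75
```

Within a branch-and-bound algorithm, `f.cv` and its subgradient `f.dcv` supply valid lower-bounding information on each subinterval; the adjoint mode developed in the thesis computes the same subgradients in reverse order.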
Applying this method transforms the given optimization problem into a robust one that also accounts for uncertainties in certain parameters. The adapted problem is then solved by the above-mentioned branch-and-bound algorithm for global optimization, delivering the desired robust minimum.
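The Taylor-series step and the robust reformulation can be sketched as follows: a tiny second-order forward-AD type propagates the value, first, and second derivative through the function, and the resulting moment approximations are combined into a robust objective. The Gaussian-input assumption, the `kappa` weighting, and all names are illustrative, not taken from the thesis.

```python
import math
from dataclasses import dataclass

@dataclass
class Taylor2:
    """Forward-mode AD value carrying first and second derivative (one input)."""
    val: float
    d1: float = 0.0
    d2: float = 0.0

    def __add__(self, o):
        o = o if isinstance(o, Taylor2) else Taylor2(o)
        return Taylor2(self.val + o.val, self.d1 + o.d1, self.d2 + o.d2)
    __radd__ = __add__

    def __mul__(self, o):
        o = o if isinstance(o, Taylor2) else Taylor2(o)
        # product rule up to second order: (uv)'' = u''v + 2 u'v' + u v''
        return Taylor2(self.val * o.val,
                       self.d1 * o.val + self.val * o.d1,
                       self.d2 * o.val + 2.0 * self.d1 * o.d1 + self.val * o.d2)
    __rmul__ = __mul__

def robust_objective(f, mu, sigma, kappa=1.0):
    """Second-order Taylor approximations of E[f(X)] and Var[f(X)] for
    X ~ N(mu, sigma^2), combined into the robust objective E + kappa*Std."""
    t = f(Taylor2(mu, 1.0, 0.0))                 # seed dx/dx = 1
    mean = t.val + 0.5 * t.d2 * sigma**2
    var = (t.d1 * sigma)**2 + 0.5 * (t.d2 * sigma**2)**2
    return mean, var, mean + kappa * math.sqrt(var)

# f(x) = x^2: for a Gaussian input these approximations happen to be exact
mean, var, rob = robust_objective(lambda x: x * x, mu=1.0, sigma=0.5)
print(mean, var)   # 1.25 1.125
```

The scalar returned as `rob` is the kind of uncertainty-aware objective that the branch-and-bound algorithm would then minimize globally; higher-order derivatives from AD would sharpen both moment approximations.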
Institutions
- Department of Computer Science [120000]
- Software and Tools for Computational Engineering Group [123120]