Description
In reliability-based topology optimization, or topology optimization under uncertainty, an accurate evaluation of the probabilistic model requires the system to be simulated for a large number of parameter realizations. Traditional gradient-based optimization schemes thus face the difficulty that reasonable accuracy and numerical efficiency often seem mutually exclusive. We propose a stochastic optimization technique to tackle this problem. Specifically, we combine the well-known method of moving asymptotes (MMA) with a stochastic sample-based integration strategy. By adaptively recombining gradient information from previous steps, we obtain a noisy gradient estimator that is asymptotically correct, i.e., the approximation error vanishes over the course of the iterations. As a consequence, the resulting stochastic method of moving asymptotes (sMMA) allows us to solve chance-constrained topology optimization problems at a fraction of the cost of traditional approaches from the literature.
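The abstract does not spell out the recombination scheme, but the idea of reusing past gradient samples so that the estimation error decays can be illustrated with a minimal sketch. The following Python snippet is an assumption-laden toy, not the speaker's actual sMMA algorithm: the adaptive recombination is modeled as an exponentially weighted average with a decaying mixing weight, the expensive probabilistic model is replaced by a simple stochastic quadratic, and the MMA subproblem solve is stood in for by a projected descent step with a diminishing step size.

```python
# Toy sketch of the adaptive gradient-recombination idea behind sMMA.
# Assumptions (not from the abstract): gamma_k = k^(-0.7) mixing schedule,
# a synthetic objective E_xi[0.5*||x - xi||^2] with xi ~ N(mu, I), and a
# projected descent step in place of the true MMA subproblem.
import numpy as np

rng = np.random.default_rng(0)

def stochastic_grad(x, n_samples=4):
    """Sample-based gradient estimate; a stand-in for the costly
    simulation of the system under varying parameters."""
    mu = np.ones_like(x)
    xi = mu + rng.standard_normal((n_samples, x.size))
    return (x - xi).mean(axis=0)  # noisy estimate of the true gradient x - mu

def smma_like_loop(x0, n_iter=500, lb=0.0, ub=2.0):
    x, g_bar = x0.copy(), np.zeros_like(x0)
    for k in range(1, n_iter + 1):
        gamma = k ** -0.7                        # decaying mixing weight
        # Recombine fresh samples with gradient information from previous
        # steps; the averaging drives the estimation error to zero.
        g_bar = (1 - gamma) * g_bar + gamma * stochastic_grad(x)
        step = 1.0 / np.sqrt(k)                  # diminishing step size
        x = np.clip(x - step * g_bar, lb, ub)    # projection replaces MMA subproblem
    return x

print(smma_like_loop(np.zeros(5)))  # approaches the true minimizer mu = [1, ..., 1]
```

Because each iteration draws only a handful of samples while the running average accumulates information across iterations, the per-step simulation cost stays low, which is the mechanism the abstract credits for solving chance-constrained problems at a fraction of the usual cost.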