The Covariance Matrix Adaptation Evolution Strategy (CMA-ES) is the most sophisticated of the global optimizers, and it converges relatively quickly for a global optimizer. CMA-ES "remembers" previous iterations, and this history is exploited to improve the performance of the algorithm while avoiding local optima.
Suitable for: General optimization, especially for complex problem domains
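To illustrate the "memory" idea, here is a heavily simplified CMA-ES sketch (rank-mu covariance update only, with a crude decaying step size instead of full step-size adaptation); the function and all parameter values are illustrative, not the actual implementation:

```python
import numpy as np

def cma_es_sketch(f, x0, sigma=0.5, iters=120, seed=0):
    """Heavily simplified CMA-ES: rank-mu covariance update, decaying step size."""
    rng = np.random.default_rng(seed)
    n = len(x0)
    lam = 4 + int(3 * np.log(n))                  # population size
    mu = lam // 2                                 # number of selected parents
    w = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
    w /= w.sum()                                  # recombination weights
    mean = np.array(x0, float)
    C = np.eye(n)                                 # the covariance "memory"
    for _ in range(iters):
        A = np.linalg.cholesky(C)
        xs = mean + sigma * rng.standard_normal((lam, n)) @ A.T
        order = np.argsort([f(x) for x in xs])
        sel = xs[order[:mu]]                      # the mu best samples
        y = (sel - mean) / sigma                  # steps relative to the old mean
        mean = w @ sel                            # weighted recombination
        # rank-mu update: successful directions reshape the sampling ellipsoid,
        # so the history of good steps biases where future samples are drawn
        C = 0.7 * C + 0.3 * (y * w[:, None]).T @ y + 1e-10 * np.eye(n)
        sigma *= 0.97                             # crude step-size schedule
    return mean
```

The covariance matrix C is what carries information between iterations: directions that produced good samples become more likely to be sampled again.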
A powerful local optimizer, which builds a linear model of the primary data in a "trust" region around the starting point. The minimizer of the model is used as the new starting point, and the process repeats until the model accurately reproduces the data. The Trust Region Framework can take advantage of S-parameter sensitivity information to reduce the number of simulations needed and speed up the optimization. It is the most robust of the optimization algorithms.
Suitable for: General optimization, especially on models with sensitivity information
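The accept/shrink logic of a trust-region loop can be sketched as follows; this toy version builds its linear model from finite differences rather than sensitivity data, and all thresholds are illustrative:

```python
import numpy as np

def trust_region_linear(f, x0, radius=1.0, iters=50, eps=1e-6):
    """Minimal trust-region loop with a finite-difference linear model."""
    x = np.array(x0, float)
    fx = f(x)
    for _ in range(iters):
        # linear model of f around x: gradient from finite differences
        g = np.array([(f(x + eps * e) - fx) / eps for e in np.eye(len(x))])
        gnorm = np.linalg.norm(g)
        if gnorm < 1e-9:
            break
        step = -radius * g / gnorm        # model minimizer on the region boundary
        predicted = radius * gnorm        # decrease the linear model promises
        trial = x + step
        f_trial = f(trial)
        rho = (fx - f_trial) / predicted  # how trustworthy was the model?
        if rho > 0.1:
            x, fx = trial, f_trial        # model was useful: accept the new point
        if rho > 0.75:
            radius *= 2.0                 # excellent agreement: widen the region
        elif rho < 0.25:
            radius *= 0.5                 # poor agreement: shrink the region
    return x
```

The key design point is the ratio rho of actual to predicted improvement: it decides both whether to move and how far the model can be trusted next time.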
Taking an evolutionary approach to optimization, the Genetic Algorithm generates points in the parameter space and refines them over successive generations through crossover and random parameter mutation. By selecting the "fittest" sets of parameters in each generation, the algorithm converges toward a global optimum.
Suitable for: Complex problem domains and models with many parameters
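A minimal sketch of the generate/select/mutate cycle, assuming blend crossover, truncation selection, and Gaussian mutation (all standard choices made here for illustration):

```python
import random

def genetic_minimize(f, bounds, pop_size=40, generations=60, mut_rate=0.2, seed=1):
    """Toy genetic algorithm: truncation selection, blend crossover, Gaussian mutation."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=f)                           # fittest first (minimization)
        parents = pop[:pop_size // 2]             # select the fittest half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]     # blend crossover
            for i, (lo, hi) in enumerate(bounds):
                if rng.random() < mut_rate:                 # random mutation
                    child[i] = min(hi, max(lo, child[i] + rng.gauss(0, (hi - lo) * 0.05)))
            children.append(child)
        pop = parents + children                  # next generation
    return min(pop, key=f)
```

Keeping the parents unmodified (elitism) guarantees that the best solution found so far is never lost between generations.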
Another global optimizer, this algorithm treats points in the parameter space as moving particles. At each iteration, each particle's position is updated according to both its own best known position and the best known position of the entire swarm. Particle Swarm Optimization works well for models with many parameters.
Suitable for: Models with many parameters
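The two attraction terms described above appear directly in the standard velocity update; a sketch with commonly used (but here illustrative) coefficient values:

```python
import random

def pso_minimize(f, bounds, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5, seed=2):
    """Basic particle swarm: inertia w, personal pull c1, swarm pull c2."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                   # best position of each particle
    pbest_f = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]      # best position of the swarm
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])   # own best
                             + c2 * r2 * (gbest[d] - pos[i][d]))     # swarm best
                pos[i][d] += vel[i][d]
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest
```

Because every particle only needs its own history plus the single shared swarm best, the update cost grows gently with the number of parameters.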
This method is a local optimization technique that uses multiple points distributed across the parameter space (a simplex) to find the optimum. The Nelder Mead Simplex Algorithm is less dependent on the starting point than most local optimizers.
Suitable for: Complex problem domains with relatively few parameters, systems without a good initial model
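A simplified sketch of the simplex moves (reflect, expand, contract, shrink); the step size and iteration count are illustrative, and some refinements of the full algorithm are omitted:

```python
def nelder_mead(f, x0, step=0.5, iters=200):
    """Simplified Nelder-Mead: evolve n+1 points by reflect/expand/contract/shrink."""
    n = len(x0)
    # initial simplex: the start point plus one offset point per dimension
    pts = [list(x0)] + [[x0[j] + (step if j == i else 0.0) for j in range(n)]
                        for i in range(n)]
    for _ in range(iters):
        pts.sort(key=f)
        best, worst = pts[0], pts[-1]
        centroid = [sum(p[j] for p in pts[:-1]) / n for j in range(n)]
        refl = [2 * centroid[j] - worst[j] for j in range(n)]   # reflect worst
        if f(refl) < f(best):
            exp = [3 * centroid[j] - 2 * worst[j] for j in range(n)]  # expand
            pts[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(pts[-2]):
            pts[-1] = refl
        else:
            contr = [(centroid[j] + worst[j]) / 2 for j in range(n)]  # contract
            if f(contr) < f(worst):
                pts[-1] = contr
            else:   # nothing helped: shrink the whole simplex toward the best point
                pts = [best] + [[(p[j] + best[j]) / 2 for j in range(n)]
                                for p in pts[1:]]
    pts.sort(key=f)
    return pts[0]
```

Because the simplex spans a region rather than a single point, a poor starting guess is less damaging than it is for gradient-based local methods.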
This is a local optimizer which uses interpolation to approximate the gradient over the parameter space, giving the Interpolated Quasi Newton method its fast convergence.
Suitable for: Computationally demanding models
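The idea can be sketched with a standard BFGS quasi-Newton loop whose gradient is approximated from function samples (plain forward differences here stand in for whatever interpolation scheme the real method uses):

```python
import numpy as np

def quasi_newton(f, x0, iters=50, eps=1e-6):
    """BFGS-style quasi-Newton with a sampled (finite-difference) gradient."""
    def grad(x, fx):
        # gradient approximated from nearby function samples
        return np.array([(f(x + eps * e) - fx) / eps for e in np.eye(len(x))])
    x = np.array(x0, float)
    fx = f(x)
    g = grad(x, fx)
    H = np.eye(len(x))                    # approximate inverse Hessian
    for _ in range(iters):
        if np.linalg.norm(g) < 1e-8:
            break
        p = -H @ g                        # quasi-Newton search direction
        t = 1.0
        while f(x + t * p) > fx + 1e-4 * t * (g @ p) and t > 1e-10:
            t *= 0.5                      # backtracking (Armijo) line search
        s = t * p
        x_new = x + s
        f_new = f(x_new)
        g_new = grad(x_new, f_new)
        y = g_new - g
        if s @ y > 1e-12:                 # BFGS update of the inverse Hessian
            rho = 1.0 / (s @ y)
            I = np.eye(len(x))
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, fx, g = x_new, f_new, g_new
    return x
```

Each iteration needs only a handful of function evaluations, which is why this family of methods suits computationally demanding models.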
A simple, robust local optimizer for single-parameter problems. Although slower than the Interpolated Quasi Newton method, it can sometimes be more accurate.
Suitable for: Single-variable optimization
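The behavior described (simple, robust, derivative-free, one variable) is exemplified by classic golden-section search; the document does not name its exact method, so this is an illustrative stand-in:

```python
def golden_section(f, lo, hi, tol=1e-6):
    """Golden-section search: shrink a bracketing interval around the minimum."""
    phi = (5 ** 0.5 - 1) / 2              # inverse golden ratio, about 0.618
    a, b = lo, hi
    c, d = b - phi * (b - a), a + phi * (b - a)
    fc, fd = f(c), f(d)
    while b - a > tol:
        if fc < fd:                       # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - phi * (b - a)
            fc = f(c)
        else:                             # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + phi * (b - a)
            fd = f(d)
    return (a + b) / 2
```

The interval shrinks by a fixed ratio per function evaluation, so progress is slower than a quasi-Newton step but never fooled by a bad gradient estimate.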
A specialized optimizer for printed circuit board (PCB) design, the Decap Optimizer calculates the most effective placement of decoupling capacitors using the Pareto front method. This can be used to minimize either the number of capacitors needed or the total cost while still meeting the specified impedance curve.
Suitable for: PCB layout
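The Pareto front mentioned above is the set of trade-off solutions no other solution beats on both objectives at once. A minimal sketch, assuming hypothetical candidate records with "caps" (capacitor count) and "cost" fields and assuming every candidate already meets the impedance target:

```python
def pareto_front(candidates):
    """Keep only candidates not dominated in (capacitor count, total cost)."""
    front = []
    for c in candidates:
        dominated = any(o["caps"] <= c["caps"] and o["cost"] <= c["cost"]
                        and (o["caps"] < c["caps"] or o["cost"] < c["cost"])
                        for o in candidates)
        if not dominated:
            front.append(c)
    return front
```

From the front, the designer picks the point that matches the priority of the moment: fewest capacitors, lowest cost, or a compromise between the two.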