Embedded Systems December 2000 Vol13_13


PROGRAMMER'S TOOLBOX

TABLE 1: The test functions

(1) f(x) = cos(2πx³)
(2) g(x) = sin(2πx³)
(3) h(x) = √|πx - 1| - 1
(4) a(x) = |x - 0.777|
(5) L(x) = x
(6) R(x) = -x
(7) K(x) = 1

On occasion, I've also used a variation:

g(x) = sin(2πx³)   (17)

I use this variation because g(x) has a maximum before its minimum. As we will see in a moment, it's important that our algorithm work even when the function increases for a time before decreasing; that's why g(x) has value for us. It should go without saying that one or two functions are not enough to prove the algorithm. As I get ever closer to the completed algorithm, I want to be quite sure that it works for functions that might seem, in some way, to be "pathological."

The loosest requirement on the function to be minimized is that it be continuous. If it is not, that is, if its value jumps around between values of x that are arbitrarily close, all bets are off, and we have nothing left to work with but random probings. If we are to have any hope of searching for a solution in the usual sense of the term, we must require that the function be continuous.

The functions f(x) and g(x), however, also have continuous first derivatives (in fact, all their derivatives are continuous). That is why their shape in the vicinity of the minimum must look like a parabola. It should go without saying that an algorithm that assumes a function shaped like a parabola works very well when the function is a parabola. In fact, such a method should converge right away, and if it doesn't, we need to fall back and regroup. On the other hand, many functions in nature don't have continuous first derivatives, and therefore don't look like parabolas, no matter what power of zoom lens we apply.
An obvious example is:

f(x) = |x|   (18)

Any decent general-purpose algorithm should find the minimum at x = 0. Not surprisingly, Brent's method, being based on successive quadratic fits, works well when the function looks like a parabola, and pretty badly with a function such as the one given in Equation 18. It ends up falling back on successive bisections, not the parabolic fit.

For the remainder of this series, I'm adopting an even more stringent function:

h(x) = √|πx - 1| - 1   (19)

The square root causes the shape to be the opposite of a parabola, in a sense: the closer one gets to the minimum, the steeper the slope of the function, and the more a sharp edge forms at the minimum. Does this mean that we don't need the simple absolute value? Not at all. This function is unique in that it has segments that are straight lines on either side of the minimum. A smart minimizer should recognize this fact, and exploit it. Brent's method doesn't. Fortunately, we are in a position to achieve fast convergence for this case.

LISTING 1: Local minimizer; exits at encounter of first minimum

double minimize(double (*f)(double), double x0, double x1, long n)
{
    double xmin = x0;
    double ymin = f(x0);
    double dx = (x1 - x0)/n;
    double y;

    x0 += dx;
    y = f(x0);
    while ((x0 < x1) && (y <= ymin)) {
        ymin = y;
        xmin = x0;
        x0 += dx;
        y = f(x0);
    }
    return xmin;
}
