Embedded Systems December 2000 Vol13_13


PROGRAMMER'S TOOLBOX

…it all practical. But it's perfect for our uses. So, this month, in addition to seeing yet another attempt to develop a good, reliable, general-purpose function minimizer, you're also going to see a program architecture that solves a lot of problems in a lot of different areas.

I've often used nested state machines in real-time systems; the switch statement that implements them doesn't cost much overhead, and neither does the single integer value returned. My experience has been that nested state machines can turn even the most hairy and convoluted exercise in logic and decision making into a clean, fast, and perfectly understandable solution. Here's hoping it will do the same in this case.

In general, another thing that makes most algorithms so inscrutable is the desire to minimize storage, especially global variables, and data motion. In this month's column, you'll see some radical departures in style. At this point, I'm interested in keeping the count of function evaluations low; that's the part that is potentially expensive. But I don't really care if I'm using one more memory location than I need to, or if I'm moving data around too much. The performance impact of such issues is typically small compared to the cost of a function evaluation, and I can save making the algorithm optimally efficient (and therefore suitably inscrutable) for another day.

Random thoughts

This month, I'm also going to concentrate on some nagging thoughts that have been haunting me for some time.
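Before getting to those thoughts, the switch-based nested state machine mentioned above can be sketched roughly as follows. This is a minimal illustration of the pattern, not code from the column; all names (`step_outer`, `step_inner`, the state enums) are mine, and the states are placeholders for the minimizer's real phases:

```c
#include <assert.h>

/* Illustrative nested state machine: an outer machine whose SEARCH
 * state delegates one step to an inner machine. Each step is just a
 * switch returning the next state as a single integer value, which is
 * the cheap mechanism the column describes. */
enum outer { O_INIT, O_SEARCH, O_DONE };
enum inner { I_BRACKET, I_BISECT, I_FIT, I_FINISHED };

static enum inner step_inner(enum inner s)
{
    switch (s) {
    case I_BRACKET: return I_BISECT;   /* bracket the minimum   */
    case I_BISECT:  return I_FIT;      /* golden-section bisect */
    case I_FIT:     return I_FINISHED; /* quadratic fit         */
    default:        return I_FINISHED;
    }
}

enum outer step_outer(enum outer s, enum inner *in)
{
    switch (s) {
    case O_INIT:
        *in = I_BRACKET;               /* start the inner machine */
        return O_SEARCH;
    case O_SEARCH:
        *in = step_inner(*in);         /* nested machine runs here */
        return (*in == I_FINISHED) ? O_DONE : O_SEARCH;
    default:
        return O_DONE;
    }
}
```

The point of the pattern is that each call does one small, obvious step; the whole control flow is visible in the two switches rather than buried in flags and gotos.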
My general feelings are:

• We're not using the data from previous steps as effectively as we could
• We need more robust criteria for deciding when to quit

Relative to both points, I've noted before that Brent's method compares only previous values of the predicted x-value of the minimum, not the y-value. We've also seen that two applications of the quadratic fit can sometimes predict exactly the same x-value, even though the y-values are radically different (and both values are nowhere close to the minimum). It seems to me that we are not squeezing enough information out of the available data. One way to do so would be to track the y-values as well as x-values.

Also, consider the process involved in two golden-section bisections. We start with three points, which define two intervals, and bisect (evenly or based on the golden section) one of the intervals. Then, based upon the results of the first bisection, we throw away one of those points, and bisect again. After all is said and done, we have two new points, giving us a total of five. Yet only three (the lowest point and its two neighbors) are passed on to the quadratic fit.

Can we get more information by using all five points? I think so. In past issues, I've suggested that if we consider four points at a time instead of three, we can perform two parabolic fits, each using the two endpoints plus one of the two interior points. The golden section generates that fourth point by bisecting one of the two intervals. But if we retain the last five points close to the minimum, we can fit two parabolas, each of which passes through the current best point. That approach must surely be superior to the one previously suggested.

I also noted last time that Brent's method uses a relative, rather than absolute, error criterion in x. On this point, I'm absolutely adamant.
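The building blocks of the scheme above can be sketched briefly. The parabolic fit below is the standard three-point interpolation formula for the vertex abscissa (the five-point idea would simply apply it twice, once per interior point); the golden-section split places the interior point the classic fraction of the way into the interval. The function names and layout are mine, not the column's:

```c
/* 2 - phi = (3 - sqrt(5))/2: the classic golden-section fraction. */
#define GOLD 0.3819660112501051

/* Golden-section interior point of the interval [a, b]. */
double golden_point(double a, double b)
{
    return a + GOLD * (b - a);
}

/* Quadratic fit through (x1,f1), (x2,f2), (x3,f3): returns the
 * abscissa of the fitted parabola's vertex (standard formula).
 * Caller must ensure the points are not collinear (p != q). */
double parabolic_min(double x1, double f1,
                     double x2, double f2,
                     double x3, double f3)
{
    double p = (x2 - x1) * (f2 - f3);
    double q = (x2 - x3) * (f2 - f1);
    return x2 - ((x2 - x1) * p - (x2 - x3) * q) / (2.0 * (p - q));
}
```

For points sampled from a true parabola, `parabolic_min` lands on the exact minimum in one step, which is why the quadratic fit converges so quickly near a smooth minimum.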
I don't care how popular the method is, this choice is simply No Good, because x may be zero, and dividing by zero is not a good idea. Implementations of Brent's algorithm all use additive deltas to prevent divide-by-zero exceptions. In my not-so-humble opinion, this approach is a band-aid that hides a fundamental problem. I'll have more to say about this in a moment.

If you were paying careful attention, you may have noticed that I previously wrote that we begin the process using two golden section steps, bisecting once on the left and once on the right. But then the thought occurred to me: why do we do this first bisection when we begin the solution sequence with three points chosen to fit the condition:

f(x1) >= f(x2) <= f(x3)    (1)

These points are perfectly good points for at least attempting a quadratic fit. So why bother bisecting? In retrospect, we shouldn't have to do so unless the solution given by the fit is somehow unsatisfactory (for example, the value xmin is identical to x1).

I must admit to you that I'm still in the process of adjusting my attitude for this one. I've been assuming for so long that we begin with the golden section that I'm going to have to adapt to this new notion. Most of what I present later in this article still assumes the golden section as a starter. Nevertheless, I believe that in the long term we don't need to do it this way.

One last thought: in my last column, I said that I'd forgo Brent's algorithm for a time, to give you guys time to try out the method I presented. But y'all let me down. Though I've gotten a lot of e-mail concerning my column (like Mike Mullen's incredible offer of a copy of Brent), not one of you seems to have actually tried the
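One conventional way to get a scale-aware stopping test without ever dividing by x (and thus without the zero case objected to above) is to compare the bracket width against a blend of relative and absolute tolerances. This is a common numerical-library idiom, not the column's own criterion; the tolerance values and the function name are mine:

```c
#include <math.h>

/* Converged when the bracket width is small relative to |x|, with an
 * absolute floor so the test still works when x is at or near zero.
 * rel_tol scales with x; abs_tol guarantees the bound is never zero. */
int converged(double x, double width, double rel_tol, double abs_tol)
{
    double tol = rel_tol * fabs(x) + abs_tol;  /* strictly positive */
    return fabs(width) <= tol;
}
```

Because the combined tolerance can never be zero, there is no division and no special-case delta; the absolute term simply takes over smoothly as x approaches zero.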
