Free Python optimization framework

Tuesday, October 28, 2008

Major changes for ralg

I have committed some major changes to the NLP/NSP solver ralg.

A couple of ideas have been borrowed from SolvOpt (written by Alex Kuntsevich, a colleague from our department, during his stay in Austria in collaboration with Franz Kappel), which still remains the best-known implementation of Naum Z. Shor's r-algorithm, with Fortran90, C and MATLAB APIs.

However, SolvOpt's implementation is more heuristic than the original Naum Z. Shor r-algorithm; since convergence of the latter has not been proved yet even for smooth convex functions, proving convergence for SolvOpt's implementation will hardly ever be possible (especially because of heuristics such as the backward line search; see the SolvOpt paper for algorithm details).
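
For readers who haven't met it, the core of Shor's r-algorithm is a subgradient step taken in a successively dilated space: after each step the space is stretched along the difference of two successive subgradients. Below is a minimal Python sketch of one such iteration; the function and parameter names are mine (not OpenOpt's or SolvOpt's API), and step-size control is deliberately left out:

import numpy as np

def r_algorithm_step(x, B, grad, h, alpha=2.0):
    # One illustrative iteration of Shor's r-algorithm.
    # x     : current point
    # B     : space-dilation matrix (the identity at the first iteration)
    # grad  : callable returning a subgradient of the objective
    # h     : current step size
    # alpha : dilation coefficient (values around 2..3 are traditional)
    g = grad(x)
    g_tilde = B.T @ g                  # subgradient in the dilated space
    d = B @ (g_tilde / np.linalg.norm(g_tilde))
    x_new = x - h * d                  # subgradient step in the original space
    r = B.T @ (grad(x_new) - g)        # difference of successive subgradients
    nr = np.linalg.norm(r)
    if nr > 1e-15:
        r = r / nr
        # dilate the space along r with coefficient 1/alpha
        B = B + (1.0 / alpha - 1.0) * np.outer(B @ r, r)
    return x_new, B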

So my department chief is not fond of the SolvOpt heuristics (he considers that they reduce the percentage of problems that can be solved, because SolvOpt can stop farther from the optimum than the primal Naum Z. Shor r-algorithm does); still, SolvOpt remains rather popular in our optimization department.

For now I have taken two ideas from SolvOpt: the modifications to the initial step size and to the step multiplier for the forward line search. As for the backward search, I found SolvOpt's approach very unclear, so my own scheme currently remains in the OO ralg implementation. Constraint handling is also done in a different way.
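
To make those two borrowed pieces concrete, here is a hedged sketch of a forward line search, assuming the usual scheme: start from an initial step h0 and keep multiplying it by q1 while the objective keeps decreasing along the chosen direction. The names h0 and q1 and the shrink-on-failure fallback are illustrative, not the actual ralg or SolvOpt code:

import numpy as np

def forward_line_search(f, x, d, h0=1.0, q1=1.1, shrink=0.5):
    # Forward line search along the descent direction -d (illustrative).
    # h0 : initial step size (one of the values the SolvOpt-inspired
    #      modifications tune)
    # q1 : step multiplier applied while the objective keeps decreasing
    h = h0
    x_new = np.asarray(x, dtype=float)
    # keep stepping while f decreases, growing the step each time
    while f(x_new - h * d) < f(x_new):
        x_new = x_new - h * d
        h *= q1
    if np.array_equal(x_new, np.asarray(x, dtype=float)):
        h *= shrink  # no progress at all: shrink the step for the next iteration
    return x_new, h

Growing the step geometrically lets the method run quickly along long ravines; the backward search (which, as said above, ralg currently handles in its own way) then deals with the final overshoot.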

Also, some code cleanup for the OO kernel has been committed.
