This section walks through the steps of implementing a new optimizer module.
The example shows the creation of a new LineSearch, which is a special type of local optimizer.
A line search method needs three parameters: the starting point, the direction in the N-dimensional space, and some sort of step length.
It optimizes the one-dimensional function f(x + a*d), where x is the starting point, d is the step direction, and a is the parameter to be optimized.
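To make the one-dimensional view concrete, here is a small self-contained sketch (not part of the package; the class, method names, and the example function are purely illustrative) of restricting an N-dimensional function to the line x + a*d:

```java
import java.util.function.DoubleUnaryOperator;
import java.util.function.ToDoubleFunction;

public class DirectionalRestriction {

    /** Restrict an N-dimensional function f to the line x + a*d, yielding g(a) = f(x + a*d). */
    public static DoubleUnaryOperator restrict(ToDoubleFunction<double[]> f,
                                               double[] x, double[] d) {
        return a -> {
            double[] p = new double[x.length];
            for (int i = 0; i < x.length; i++) {
                p[i] = x[i] + a * d[i];
            }
            return f.applyAsDouble(p);
        };
    }

    public static void main(String[] args) {
        // Example: f(p) = p0^2 + p1^2, starting at (1, 1), direction (-1, -1)
        DoubleUnaryOperator g = restrict(p -> p[0] * p[0] + p[1] * p[1],
                                         new double[]{1, 1}, new double[]{-1, -1});
        System.out.println(g.applyAsDouble(0.0)); // 2.0 (the starting value)
        System.out.println(g.applyAsDouble(1.0)); // 0.0 (the minimum along this line)
    }
}
```

A line search only ever sees this one-dimensional g, which is what makes it cheap compared to a full N-dimensional optimizer.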
```
FUNCTION CustomLineSearch(stepDirection, stepLength, position, value, maxFailedSteps)
START
    optPosition := position
    optValue := value
    fails := 0
    DO
        position := position + stepDirection * stepLength
        position := EnforceBounds(position)
        value := f(position)
        IF value < optValue THEN
            optPosition := position
            optValue := value
            fails := 0
        ELSE
            fails := fails + 1
        ENDIF
    WHILE (fails <= maxFailedSteps)
STOP
```
The parameters, in order, are the N-dimensional direction vector to move in, the step scaling factor, the starting position, the function value at the starting position, and the maximum number of consecutive failed steps allowed. First the optimum and the failed-step counter are initialized. The do-while loop checks the fail count, so it always lets at least one position be tested. Inside the loop the position is moved, then the normalized search-space bounds are enforced. Enforcing the bounds is necessary because the original search space is always normalized into the N-dimensional [-1, 1] cube and the optimizer must not exceed that boundary. Next we evaluate the function at the new position and check whether it is better than the previous optimum. Since the package minimizes, a function value higher than the optimum value increments the fail counter. If the value is better than the optimum, we save the new point and reset the fail counter. When the fail count exceeds the maximum, the algorithm stops and the found optimum can be extracted.
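The package's actual EnforceBounds is not shown here; as a sketch, coordinate-wise clamping into the normalized [-1, 1] cube could look like this (class and method names are illustrative, not the package's API):

```java
import java.util.Arrays;

public class BoundsEnforcer {

    /** Clamp each coordinate of the position into the normalized search cube [-1, 1]. */
    public static double[] enforceBounds(double[] position) {
        double[] clamped = new double[position.length];
        for (int i = 0; i < position.length; i++) {
            clamped[i] = Math.max(-1.0, Math.min(1.0, position[i]));
        }
        return clamped;
    }

    public static void main(String[] args) {
        // Coordinates outside [-1, 1] are pulled back to the nearest face of the cube.
        System.out.println(Arrays.toString(enforceBounds(new double[]{1.7, -2.3, 0.4})));
        // [1.0, -1.0, 0.4]
    }
}
```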
The implementation mirrors the pseudocode, meaning that they produce the same results, but it uses some helper functions and variables to make the code more understandable and to complete the abstractions of the module system. The following code segment shows the member variables and the evaluation part.
```java
/**
 * Starting vector.
 */
private Vector x;

/**
 * Possible local minimum place of the kth iteration.
 */
private Vector xk;

/**
 * Starting local minimum value.
 */
private double fx;

/**
 * Possible local minimum value of the kth iteration.
 */
private double fxk;

private double stepLength;

private long maxFailedSteps;

public void run() {
    if (!isRunnable) {
        throw new IllegalArgumentException(
                ErrorMessages.LINESEARCH_NOT_PARAMETERIZED_YET);
    }
    long failedSteps = 0;
    step();
    do {
        if (fxk < fx) {
            x.setCoordinates(xk.getCoordinates());
            fx = fxk;
            super.sumOfStepLengths += stepLength;
            super.success = true;
            super.maxSuccessfulStepLength = stepLength;
            failedSteps = 0;
        } else {
            failedSteps++;
        }
        // Stop once the consecutive fail count exceeds the maximum,
        // matching WHILE (fails <= maxFailedSteps) in the pseudocode.
        if (failedSteps > maxFailedSteps) {
            break;
        }
        step();
    } while (true);
    super.optimum.setCoordinates(x.getCoordinates());
    super.optimumValue = fx;
}
```
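Since step(), Vector, and the base-class fields are internal to the package, it can help to try the same loop as a self-contained sketch on plain arrays (everything here is illustrative: the class, the test function, and the parameter values are not from the package):

```java
import java.util.Arrays;
import java.util.function.ToDoubleFunction;

public class LineSearchSketch {

    /** Standalone version of the pseudocode: repeated steps along a fixed direction. */
    public static double[] search(ToDoubleFunction<double[]> f, double[] start,
                                  double[] direction, double stepLength,
                                  long maxFailedSteps) {
        double[] position = start.clone();
        double[] optPosition = start.clone();
        double optValue = f.applyAsDouble(start);
        long fails = 0;
        do {
            for (int i = 0; i < position.length; i++) {
                position[i] += direction[i] * stepLength;
                // Enforce the normalized [-1, 1] bounds by clamping.
                position[i] = Math.max(-1.0, Math.min(1.0, position[i]));
            }
            double value = f.applyAsDouble(position);
            if (value < optValue) {
                optPosition = position.clone();
                optValue = value;
                fails = 0;
            } else {
                fails++;
            }
        } while (fails <= maxFailedSteps);
        return optPosition;
    }

    public static void main(String[] args) {
        // Minimize f(p) = (p0 - 0.5)^2 starting from -1, stepping toward +1.
        double[] opt = search(p -> (p[0] - 0.5) * (p[0] - 0.5),
                              new double[]{-1.0}, new double[]{1.0}, 0.1, 3);
        System.out.println(Arrays.toString(opt)); // a value close to [0.5]
    }
}
```

Note that, like the pseudocode, this only finds the best point on a fixed grid of step multiples; the real module relies on the global phase of the package to supply good starting points and directions.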
If you do not have Ant installed, install it as described in the "Environment" section, or run the optimizer the way it is discussed in the "Using the prebuilt package" section.
To test the new module, you can find the sources on the Downloads page, in the custom_module_and_recompile directory of the zip.
After downloading, navigate to the root of the example and build the project with ant build-original.
This builds the release version of the project and copies some config files and a run file into the build directory. Next, go to the build directory and start the run-algorithms script (.bat for Windows, .sh for Linux).
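Assuming the example was extracted so that its root is the custom_module_and_recompile directory, the steps above amount to something like this on Linux:

```shell
# from the root of the example (the custom_module_and_recompile directory)
ant build-original       # builds the release version and copies configs into build/
cd build
sh run-algorithms.sh     # on Windows: run-algorithms.bat
```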
In a couple of minutes the script runs the two optimizer configs on all the BNDs, 10 times each, and collects the running information.
When the runs finish for an optimizer config and BND pair, the results of the 10 runs are averaged and written to res.txt.
As expected, the newly implemented line search gives poor results. Be aware that the run-algorithms script deletes the result directory!
|                                   | bra    | eas     | shu   | sh5    | gpr    |
|-----------------------------------|--------|---------|-------|--------|--------|
| GlobalUnirandiCLS                 | 248.2  | 1059.7  | 418.7 | 1862.5 | 310    |
| GlobalUnirandiCLSCustomLineSearch | 168901 | 97818.7 | 503.3 | 9658.6 | 189944 |
As the table shows, the new line search performs significantly worse than the original version on nearly every function.