
ASOC: An Adaptive Parameter-free Stochastic Optimization Technique for Continuous Variables

About

Stochastic optimization is an important task in many problems that cannot be expressed as convex optimization problems. For such non-convex problems, various stochastic algorithms are available, such as simulated annealing, evolutionary algorithms, and tabu search. Most of these algorithms require problem-specific, user-defined parameters in order to find the optimal solution. Moreover, in many situations these user-defined parameters require iterative fine-tuning, so the algorithms cannot adapt if the search space and the optima change over time. In this paper we propose an adaptive parameter-free stochastic optimization technique for continuous random variables, called ASOC.
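The paper's exact update rules are not reproduced on this page, but the general idea of a parameter-free stochastic search can be illustrated with a generic adaptive-step random search. This is only a sketch: the function name `adaptive_search` and the expand/contract factors are illustrative assumptions, not ASOC itself.

```python
import random

def adaptive_search(f, x0, iters=20000, seed=0):
    """Generic adaptive-step random local search (a sketch, not ASOC).

    The step size grows on improvement and shrinks on failure, so the
    user does not have to hand-tune a problem-specific step parameter.
    """
    rng = random.Random(seed)
    x = list(x0)
    fx = f(x)
    step = 1.0
    for _ in range(iters):
        # Propose a Gaussian perturbation of the current point.
        cand = [xi + rng.gauss(0, step) for xi in x]
        fc = f(cand)
        if fc < fx:
            x, fx = cand, fc
            step *= 1.5            # expand the step on success
        else:
            step = max(step * 0.95, 1e-12)  # contract on failure

    return x, fx

def matyas(v):
    # Matyas function: global minimum 0 at (0, 0).
    x, y = v
    return 0.26 * (x * x + y * y) - 0.48 * x * y

best_x, best_f = adaptive_search(matyas, [5.0, 5.0])
print(best_f)  # close to the global minimum 0
```

The expand/contract schedule plays the role the user-defined parameters play in simulated annealing or evolutionary algorithms: the effective step scale settles automatically to roughly the distance from the optimum.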

Jayanta Basak • 2015

Related benchmarks

Task                      | Dataset                   | Metric               | Result  | Rank
Mathematical Optimization | Beale function            | Functional Minima    | 0.00e+0 | 8
Mathematical Optimization | Booth function            | Functional Minima    | 4.00e-4 | 8
Mathematical Optimization | Matyas function           | Functional Minima    | 5.00e-5 | 8
Mathematical Optimization | Eggholder function        | Functional Minima    | -959.6  | 8
Mathematical Optimization | Schaffer N. 2 function    | Functional Minima    | 5.00e-4 | 8
Mathematical Optimization | Schaffer N. 4 function    | Schaffer N. 4 Minima | 0.5     | 8
Mathematical Optimization | Ackley function           | Functional Minima    | 0.008   | 8
Mathematical Optimization | Three-hump camel function | Functional Minima    | 4.00e-6 | 8
Mathematical Optimization | Goldstein-Price function  | Functional Minima    | 3.0022  | 8
Mathematical Optimization | McCormick function        | Functional Minima    | -1.9132 | 8
Showing 10 of 18 rows
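Several of the benchmark functions above have standard closed forms and well-known global minima, which makes the reported "Functional Minima" values easy to sanity-check. The definitions below are the usual textbook ones; the prints evaluate each function at its known optimum.

```python
import math

def beale(x, y):
    # Global minimum 0 at (3, 0.5).
    return ((1.5 - x + x * y) ** 2
            + (2.25 - x + x * y ** 2) ** 2
            + (2.625 - x + x * y ** 3) ** 2)

def booth(x, y):
    # Global minimum 0 at (1, 3).
    return (x + 2 * y - 7) ** 2 + (2 * x + y - 5) ** 2

def matyas(x, y):
    # Global minimum 0 at (0, 0).
    return 0.26 * (x * x + y * y) - 0.48 * x * y

def eggholder(x, y):
    # Global minimum about -959.64 at (512, 404.2319).
    return (-(y + 47) * math.sin(math.sqrt(abs(x / 2 + y + 47)))
            - x * math.sin(math.sqrt(abs(x - (y + 47)))))

def mccormick(x, y):
    # Global minimum about -1.9133 at (-0.54719, -1.54719).
    return math.sin(x + y) + (x - y) ** 2 - 1.5 * x + 2.5 * y + 1

print(beale(3, 0.5))                              # 0.0
print(booth(1, 3))                                # 0
print(matyas(0, 0))                               # 0.0
print(round(eggholder(512, 404.2319), 1))         # -959.6
print(round(mccormick(-0.54719, -1.54719), 4))    # -1.9132
```

Comparing these known optima against the table, ASOC's reported results for the Eggholder and McCormick functions match the global minima to the precision shown.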
