

Optimization of multiple parameters with many local minima
source link: https://www.codesd.com/item/optimization-of-multiple-parameters-with-many-local-minima.html

I'm looking for algorithms to find a "best" set of parameter values. The function in question has a lot of local minima and changes very quickly. To make matters even worse, testing a set of parameters is very slow - on the order of 1 minute - and I can't compute the gradient directly.
Are there any well-known algorithms for this kind of optimization?
I've had moderate success with just trying random values. I'm wondering if I can improve performance by making the random parameter chooser less likely to pick parameters close to ones that produced bad results in the past (a sketch of this idea follows the list below). Is there a name for this approach, so that I can search for specific advice?
More info:
- Parameters are continuous
- There are on the order of 5-10 parameters. Certainly not more than 10.
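As a rough illustration of the "avoid regions that scored badly" idea described above, here is a minimal random-search sketch in Python. The objective `evaluate`, the bounds, the rejection radius and the iteration counts are all hypothetical placeholders, not anything from the original question:

```python
import numpy as np

rng = np.random.default_rng(0)

def evaluate(params):
    """Stand-in for the real objective, which takes ~1 minute per call (hypothetical)."""
    return float(np.sum(np.sin(5 * params) + params ** 2))

def biased_random_search(bounds, n_iter=50, n_candidates=20, radius=0.1):
    """Random search that avoids sampling close to previously bad points.

    bounds: sequence of (low, high) pairs, one per parameter.
    Candidates falling within `radius` (in normalized [0, 1] coordinates)
    of a known bad point are re-drawn, up to `n_candidates` attempts.
    """
    bounds = np.asarray(bounds, dtype=float)
    span = bounds[:, 1] - bounds[:, 0]
    best_x, best_f = None, np.inf
    history = []      # (normalized point, objective value)
    bad_points = []   # normalized points in the worst half seen so far

    for _ in range(n_iter):
        # Draw candidates, discarding those close to known bad regions.
        for _ in range(n_candidates):
            u = rng.random(len(bounds))
            if all(np.linalg.norm(u - b) > radius for b in bad_points):
                break
        x = bounds[:, 0] + u * span
        f = evaluate(x)
        history.append((u, f))
        if f < best_f:
            best_x, best_f = x, f
        # Mark the worst half of everything seen so far as "bad".
        cutoff = np.median([fv for _, fv in history])
        bad_points = [uv for uv, fv in history if fv > cutoff]

    return best_x, best_f

best_x, best_f = biased_random_search([(-2.0, 2.0)] * 6)
print(best_x, best_f)
```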
How many parameters are there -- e.g., how many dimensions in the search space? Are they continuous or discrete -- e.g., real numbers, integers, or just a few possible values?
Approaches that I've seen used for this kind of problem have a similar overall structure: take a large number of sample points, and somehow adjust them all towards regions that give "good" answers. Since you have a lot of points, their relative differences serve as a makeshift gradient.
- Simulated Annealing: The classic approach. Take a bunch of points, and probabilistically move some of them to a neighbouring point chosen at random, depending on how much better it is (a minimal sketch follows this list).
- Particle Swarm Optimization: Take a "swarm" of particles with velocities in the search space and probabilistically move each particle; if a move is an improvement, let the whole swarm know.
- Genetic Algorithms: This is a little different. Rather than using neighbourhood information as above, you take the best results each time and "cross-breed" them, hoping to get the best characteristics of each.
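Following up on the simulated-annealing bullet, here is a minimal single-chain sketch in Python. The objective, the bounds, the geometric cooling schedule and the step size are illustrative assumptions, not part of the original answer:

```python
import math
import numpy as np

rng = np.random.default_rng(1)

def simulated_annealing(evaluate, bounds, n_iter=200, step=0.1,
                        t_start=1.0, t_end=1e-3):
    """Single-chain simulated annealing over a continuous box.

    `evaluate` is the expensive objective; `bounds` is a sequence of
    (low, high) pairs. A worse neighbour is accepted with probability
    exp(-delta / T), where T follows a geometric cooling schedule.
    """
    bounds = np.asarray(bounds, dtype=float)
    span = bounds[:, 1] - bounds[:, 0]
    x = bounds[:, 0] + rng.random(len(bounds)) * span
    f = evaluate(x)
    best_x, best_f = x.copy(), f

    for i in range(n_iter):
        t = t_start * (t_end / t_start) ** (i / (n_iter - 1))  # cooling
        cand = np.clip(x + rng.normal(scale=step * span),
                       bounds[:, 0], bounds[:, 1])
        f_cand = evaluate(cand)
        if f_cand < f or rng.random() < math.exp(-(f_cand - f) / t):
            x, f = cand, f_cand
            if f < best_f:
                best_x, best_f = x.copy(), f

    return best_x, best_f
```

With the hypothetical `evaluate` from the earlier sketch, this would be called as `simulated_annealing(evaluate, [(-2.0, 2.0)] * 6)`.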
The Wikipedia articles have pseudocode for the first two; GA methods have so much variety that it's hard to list just one algorithm, but you can follow links from there. Note that there are existing implementations of all of the above that you can use or take as a starting point.
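For instance, SciPy ships global optimizers along these lines. A minimal sketch, again assuming a hypothetical stand-in objective and bounds:

```python
import numpy as np
from scipy.optimize import differential_evolution, dual_annealing

def evaluate(params):
    # Stand-in for the real, slow objective (hypothetical).
    params = np.asarray(params)
    return float(np.sum(np.sin(5 * params) + params ** 2))

bounds = [(-2.0, 2.0)] * 6   # one (low, high) pair per parameter

# Population-based evolutionary search (closest in spirit to a GA).
result_de = differential_evolution(evaluate, bounds, maxiter=30, popsize=10)

# SciPy's generalized simulated-annealing variant.
result_sa = dual_annealing(evaluate, bounds, maxiter=200)

print(result_de.x, result_de.fun)
print(result_sa.x, result_sa.fun)
```

Both routines need only a callable and per-parameter bounds, which matches the 5-10 continuous parameters described in the question.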
Note that all of these -- and really any approach to a high-dimensional search problem like this -- are heuristics, which means they have parameters of their own that have to be tuned to your particular problem, which can be tedious.
By the way, the fact that the function evaluation is so expensive can be made to work for you a bit: since all of the above methods involve lots of independent function evaluations, that piece of the algorithm can be trivially parallelized with OpenMP or something similar, to make use of as many cores as you have on your machine.
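In Python, the same idea can be sketched with a process pool; `evaluate` is once more a hypothetical stand-in for the real one-minute objective:

```python
from multiprocessing import Pool

import numpy as np

def evaluate(params):
    # Stand-in for the real ~1-minute objective (hypothetical).
    return float(np.sum(np.sin(5 * params) + params ** 2))

def evaluate_batch(candidates, processes=None):
    """Evaluate a batch of parameter vectors in parallel.

    With N worker processes, a batch of N candidates costs roughly one
    evaluation's wall time instead of N.
    """
    with Pool(processes=processes) as pool:
        return pool.map(evaluate, candidates)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    candidates = [rng.uniform(-2.0, 2.0, size=6) for _ in range(8)]
    scores = evaluate_batch(candidates)
    best = min(range(len(candidates)), key=lambda i: scores[i])
    print(candidates[best], scores[best])
```

Population-based methods map naturally onto this pattern, since each generation's candidates can be scored independently.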