Investigate other gradient-free optimisation algorithms #284
Nelder-Mead is a good thing to have for local optimisation. SciPy has a version we could wrap, but it would be nicer to have something we can use ask-and-tell on. Will investigate!
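To illustrate what an ask-and-tell interface buys us, here is a minimal sketch. The class name and method signatures are hypothetical (not the project's eventual API), and plain random search stands in for Nelder-Mead to keep the example short; the point is only that the optimiser never calls the objective itself, so the caller controls evaluation (e.g. for parallelism or custom logging).

```python
import numpy as np


class AskTellRandomSearch:
    """Illustrative ask-and-tell optimiser (hypothetical API).

    The caller requests candidate points with ask(), evaluates the
    objective however it likes, and reports the scores via tell().
    """

    def __init__(self, x0, sigma=0.5, pop_size=8):
        self._best_x = np.asarray(x0, dtype=float)
        self._best_f = np.inf
        self._sigma = sigma
        self._pop_size = pop_size
        self._pending = None

    def ask(self):
        # Propose a batch of candidates around the current best point
        self._pending = self._best_x + self._sigma * np.random.randn(
            self._pop_size, len(self._best_x))
        return self._pending

    def tell(self, fs):
        # Update the incumbent with the best evaluated candidate
        i = int(np.argmin(fs))
        if fs[i] < self._best_f:
            self._best_f = fs[i]
            self._best_x = self._pending[i]

    def best(self):
        return self._best_x, self._best_f


# Usage: minimise a quadratic; note the optimiser never sees f directly
np.random.seed(1)
f = lambda x: np.sum((x - 3.0) ** 2)
opt = AskTellRandomSearch(x0=[0.0, 0.0])
for _ in range(200):
    xs = opt.ask()
    opt.tell([f(x) for x in xs])
```

A wrapped `scipy.optimize.minimize(..., method='Nelder-Mead')` cannot be driven this way, since SciPy owns the evaluation loop; that is the trade-off the comment above is pointing at.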
Closed as duplicate of #684
Seeing as there is unlikely to be one sampling mode that does best, there is also unlikely to be one (gradient-free or otherwise) maximisation mode that fits best. As such, it may be interesting to try to implement some of the following, especially since much can be borrowed from the MCMC code. These methods also tend to be simpler than sampling algorithms (I think/hope).
I also wonder if we could make our own optimisation variants based on MultiNest/nested sampling for Bayesian problems, seeing as this algorithm appears to be particularly good at finding modes.
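A toy version of that idea can be sketched as follows. This is not MultiNest: new points are drawn by naive rejection sampling from a box prior (only feasible in low dimensions), and the function name and parameters are made up for illustration. The nested-sampling mechanic it does show is the rising likelihood constraint, which contracts the live-point population onto the mode(s).

```python
import numpy as np


def nested_sampling_mode(log_l, lower, upper, n_live=50, n_iter=500, seed=0):
    """Toy nested-sampling loop used as a mode finder (illustrative only)."""
    rng = np.random.default_rng(seed)
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)

    # Draw the initial live points uniformly from the box prior
    live = rng.uniform(lower, upper, size=(n_live, len(lower)))
    live_l = np.array([log_l(x) for x in live])

    for _ in range(n_iter):
        worst = int(np.argmin(live_l))
        threshold = live_l[worst]
        # Replace the worst live point with a draw above the current
        # likelihood threshold (rejection sampling with a retry cap)
        for _ in range(1000):
            x = rng.uniform(lower, upper)
            lx = log_l(x)
            if lx > threshold:
                live[worst], live_l[worst] = x, lx
                break
        else:
            break  # acceptance rate too low; stop early

    best = int(np.argmax(live_l))
    return live[best], live_l[best]


# Usage: locate the mode of a 2-d Gaussian log-likelihood centred at (1, -2)
log_l = lambda x: -np.sum((x - np.array([1.0, -2.0])) ** 2)
x_hat, _ = nested_sampling_mode(log_l, lower=[-5, -5], upper=[5, 5])
```

A real implementation would replace the rejection step with constrained sampling (ellipsoidal decomposition in MultiNest, slice sampling in PolyChord), which is what makes the approach scale and lets it track several modes at once.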