
Investigate other gradient-free optimisation algorithms #284

Closed · ben18785 opened this issue Mar 24, 2018 · 3 comments

@ben18785 (Collaborator)

Since no single sampling method is likely to perform best on every problem, the same presumably holds for optimisation methods (gradient-free or otherwise). It may therefore be worth implementing some of the following, especially since much can be borrowed from the MCMC code. They also tend to be simpler than sampling algorithms (I think/hope):

  • Nelder-Mead (as far as I can tell, this has no sampling equivalent),
  • Differential evolution (differential evolution, DREAM, and emcee, the "MCMC Hammer", on the sampling side),
  • Simulated annealing (SMC and population MCMC on the sampling side; a minimal sketch follows this list),
  • Random search.
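
For concreteness, here is a minimal simulated-annealing sketch in plain NumPy. The function name, default parameters, and geometric cooling schedule are all illustrative choices, not an existing interface:

```python
import numpy as np

def simulated_annealing(f, x0, sigma=0.5, t0=1.0, cooling=0.995,
                        n_iter=2000, seed=None):
    """Minimise f(x): accept uphill moves with probability
    exp(-(f(candidate) - f(x)) / T), where T decays geometrically."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    best_x, best_f = x.copy(), fx
    t = t0
    for _ in range(n_iter):
        candidate = x + sigma * rng.standard_normal(x.shape)
        fc = f(candidate)
        # Always accept improvements; accept worse points with a
        # temperature-dependent probability (the annealing step).
        if fc < fx or rng.random() < np.exp(-(fc - fx) / t):
            x, fx = candidate, fc
            if fx < best_f:
                best_x, best_f = x.copy(), fx
        t *= cooling  # geometric cooling schedule
    return best_x, best_f

# Example: recover the minimum of a shifted quadratic
f = lambda x: float(np.sum((x - 3.0) ** 2))
x, fx = simulated_annealing(f, x0=[0.0, 0.0], seed=1)
```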

I also wonder whether we could make our own optimisation variants based on MultiNest/nested sampling for Bayesian problems, since that algorithm appears to be particularly good at finding modes.

@MichaelClerx (Member)

Nelder-Mead is a good thing to have for local optimisation. SciPy has a version we could wrap, but it would be nicer to have something that supports the ask-and-tell interface. Will investigate!
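
For reference, here is a sketch of what ask-and-tell could look like: the caller asks for points, evaluates them itself, and tells the optimiser the scores, so expensive objective evaluations stay under user control. The `GaussianAskTell` class below is a hypothetical illustration (a toy hill-climber), not SciPy's API or anything that exists yet:

```python
import numpy as np

class GaussianAskTell:
    """Toy ask-and-tell optimiser: proposes Gaussian perturbations of
    the best point seen so far; the caller owns the evaluation loop."""

    def __init__(self, x0, sigma=0.1, seed=None):
        self._rng = np.random.default_rng(seed)
        self._sigma = sigma
        self._best_x = np.asarray(x0, dtype=float)
        self._best_f = np.inf
        self._proposed = None

    def ask(self):
        # Return the next point to evaluate (real methods may return a batch).
        self._proposed = (self._best_x
                          + self._sigma * self._rng.standard_normal(self._best_x.shape))
        return self._proposed

    def tell(self, fx):
        # Receive the score for the last proposed point and update state.
        if fx < self._best_f:
            self._best_x, self._best_f = self._proposed, fx

# Because the loop lives in user code, evaluation is easy to
# parallelise, log, or stop early:
f = lambda x: float(np.sum((x - 3.0) ** 2))
opt = GaussianAskTell(x0=[0.0, 0.0], seed=1)
for _ in range(500):
    x = opt.ask()
    opt.tell(f(x))
```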

@MichaelClerx (Member)

See also #55 and #54

@MichaelClerx (Member)

Closed as duplicate of #684
