Inverse binomial sampling for efficient log-likelihood estimation of simulator models


Inverse binomial sampling (IBS)

What is it

IBS is a technique to obtain unbiased, efficient estimates of the log-likelihood of a model by simulation [1].

The typical scenario is one in which you have a simulator, that is, a model from which you can randomly draw synthetic observations (for a given parameter vector), but whose log-likelihood you cannot evaluate analytically or numerically. In other words, IBS affords likelihood-based inference for models without explicit likelihood functions (also known as implicit models).

IBS is commonly used as part of an algorithm for maximum-likelihood estimation or Bayesian inference.
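To make the core idea concrete: for a single trial, IBS draws synthetic responses from the simulator until one matches the observed response. If the first match occurs at the K-th draw, the trial's log-likelihood estimate is -(1/1 + 1/2 + ... + 1/(K-1)), which is an unbiased estimate of the trial's log-likelihood [1]; the per-trial estimates are then summed. Below is a minimal MATLAB sketch of this procedure; all names are placeholders, not the interface of the implementation in this repository.

    % Minimal sketch of the IBS estimator for a single trial (placeholder
    % names). simfun(theta,stim) draws one synthetic response from the
    % simulator for parameter vector theta and trial stimulus stim.
    function logp = ibs_trial_sketch(simfun,theta,stim,resp)
        k = 1;                                    % number of draws so far
        while ~isequal(simfun(theta,stim),resp)   % sample until first match
            k = k + 1;
        end
        logp = -sum(1./(1:k-1));   % unbiased estimate of log p(resp|theta);
                                   % the empty sum for k = 1 gives 0 = log 1
    end

The estimate of the total log-likelihood is the sum of these per-trial estimates; averaging over independent repetitions of the procedure further reduces the variance.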

Code

This repository contains basic and advanced implementations of IBS, together with example usage, in various programming languages. At the moment we provide a MATLAB implementation, and we plan to add others (e.g., Python).
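As a purely illustrative sketch of what calling such an implementation might look like (the actual function names and signatures may differ; see the repository documentation):

    % Hypothetical usage sketch; check the repository for the actual
    % interface. We assume an estimator that takes a simulator handle, a
    % parameter vector, a matrix of observed responses, and a design matrix
    % (one row per trial), and returns an unbiased estimate of the negative
    % log-likelihood.
    simfun = @(theta,D) my_simulator(theta,D);        % your model simulator
    nlogL = ibslike(simfun,theta,respMat,designMat);  % estimated -log L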

The code used to produce the results in the paper [1] is available in the development repository here.

For practical recommendations and any other questions, check out the FAQ on the IBS wiki.

References

  1. van Opheusden*, B., Acerbi*, L. & Ma, W.J. (2020). Unbiased and efficient log-likelihood estimation with inverse binomial sampling. PLoS Computational Biology 16(12): e1008483. (* equal contribution) (link)

You can cite IBS in your work with something along the lines of:

We estimated the log-likelihood using inverse binomial sampling (IBS; van Opheusden, Acerbi & Ma, 2020), a technique that produces unbiased and efficient estimates of the log-likelihood via simulation.

If you use IBS in conjunction with Bayesian Adaptive Direct Search, as recommended in the paper, you could add:

We obtained maximum-likelihood estimates of the model parameters via Bayesian Adaptive Direct Search (BADS; Acerbi & Ma, 2017), a hybrid Bayesian optimization algorithm which affords stochastic objective evaluations.

and cite the appropriate paper:

  1. Acerbi, L. & Ma, W. J. (2017). Practical Bayesian optimization for model fitting with Bayesian Adaptive Direct Search. In Advances in Neural Information Processing Systems 30:1834-1844.

Similarly, if you use IBS in combination with Variational Bayesian Monte Carlo, you should cite these papers:

  1. Acerbi, L. (2018). Variational Bayesian Monte Carlo. In Advances in Neural Information Processing Systems 31: 8222-8232.
  2. Acerbi, L. (2020). Variational Bayesian Monte Carlo with Noisy Likelihoods. In Advances in Neural Information Processing Systems 33: 8211-8222.
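Again as a hedged sketch (the noisy-target option and the estimator's second output are assumptions; check the VBMC and IBS documentation): VBMC targets the log joint, and for noisy targets it expects the target function to also return the standard deviation of its estimate.

    % Sketch of Bayesian inference with VBMC using the noisy IBS estimate
    % (placeholder names; option names and signatures are assumptions).
    options = struct();
    options.SpecifyTargetNoise = true;    % assumed option for noisy targets
    [vp,elbo] = vbmc(@(t) logjoint_sketch(t,simfun,respMat,designMat), ...
        theta0,lb,ub,plb,pub,options);

    % Log joint and standard deviation of the estimate. We assume the IBS
    % routine can return the variance of its estimate as a second output.
    function [lp,lp_sd] = logjoint_sketch(theta,simfun,respMat,designMat)
        [nlogL,nlogLvar] = ibslike(simfun,theta,respMat,designMat);
        lp = -nlogL + log_prior(theta);   % log_prior: user-supplied placeholder
        lp_sd = sqrt(nlogLvar);
    end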

Besides formal citations, you can demonstrate your appreciation for our work in the following ways:

  • Star the IBS repository on GitHub;
  • Follow us on Twitter (Luigi, Bas) for updates about IBS and other projects we are involved with;
  • Tell us about your model-fitting problem and your experience with IBS (positive or negative) at [email protected] or [email protected] (putting 'IBS' in the subject of the email).

License

The IBS code is released under the terms of the MIT License.

