- Posteriors saved under `sbi` `v0.17.2` or older cannot be loaded under `sbi` `v0.18.0` or newer.
- `sample_with` can no longer be passed to `.sample()`. Instead, the user has to rerun `.build_posterior(sample_with=...)` (#573); see the sketch below this list.
- The `posterior` no longer has the method `.sample_conditional()`. Using this feature now requires the sampler interface (see the sampler-interface tutorial) (#573).
- `retrain_from_scratch_each_round` is now called `retrain_from_scratch` (#598, thanks to @jnsbck).
- API changes that had been introduced in `sbi` `v0.14.0` and `v0.15.0` are now enforced. Using the interface prior to those changes leads to an error (#645).
- The prior passed to SNPE / SNLE / SNRE must be a PyTorch distribution (#655); see FAQ-7 for how to pass a custom prior.
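For example, the sampling method is now chosen when building the posterior rather than when sampling. A minimal sketch, assuming a trained `inference` object, a trained `density_estimator`, and an observation `x_o` (all hypothetical names):

```python
# Sketch: select the sampling method via build_posterior, not via .sample().
posterior = inference.build_posterior(density_estimator, sample_with="mcmc")
samples = posterior.sample((1000,), x=x_o)  # no sample_with argument here anymore
```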
- new `sampler interface` (#573); see the sketch after this list
- posterior quality assurance with simulation-based calibration (SBC) (#501)
- added Sequential Neural Variational Inference (SNVI) (Glöckler et al. 2022) (#609, thanks to @manuelgloeckler)
- bugfix for SNPE-C with mixture density networks (#573)
- bugfix for sampling-importance resampling (SIR) as `init_strategy` for MCMC (#646)
- new density estimator for neural likelihood estimation with mixed data types (MNLE, #638)
- MCMC can now be parallelized across CPUs (#648)
- improved device check to remove several GPU issues (#610, thanks to @LouisRouillard)
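The sampler interface mentioned above decouples the potential function from the sampler. A minimal sketch for a likelihood-based method, assuming a trained `likelihood_estimator`, a `prior`, and an observation `x_o`; the names follow the sbi sampler-interface tutorial and should be checked against your installed version:

```python
from sbi.inference import MCMCPosterior, likelihood_estimator_based_potential

# Build a potential function (and a transform to unconstrained space)
# from the trained likelihood estimator and the observation.
potential_fn, theta_transform = likelihood_estimator_based_potential(
    likelihood_estimator, prior, x_o
)

# Plug the potential into an MCMC-based posterior and sample from it.
posterior = MCMCPosterior(
    potential_fn, proposal=prior, theta_transform=theta_transform
)
samples = posterior.sample((1000,))
```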
- `pairplot` takes `ax` and `fig` (#557); see the sketch below
- bugfix for rejection sampling (#561)
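As a small illustration of the `ax` / `fig` arguments (a sketch, assuming `samples` holds draws from a 2-dimensional posterior):

```python
import matplotlib.pyplot as plt
from sbi.analysis import pairplot

# Draw the pairplot into an existing figure / axes grid instead of
# letting pairplot create its own.
fig, ax = plt.subplots(2, 2, figsize=(6, 6))
fig, ax = pairplot(samples, fig=fig, ax=ax)
```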
- remove warning when using multiple transforms with NSF in a single dimension (#537)
- Sampling-importance-resampling (SIR) is now the default `init_strategy` for MCMC (#605)
- change `mp_context` to allow for multi-chain pyro samplers (#608, thanks to @sethaxen)
- tutorial on posterior predictive checks (#592, thanks to @LouisRouillard)
- add FAQ entry for using a custom prior (#595, thanks to @jnsbck)
- add methods to plot tensorboard data (#593, thanks to @lappalainenj)
- add option to pass the support for custom priors (#602)
- plotting method for 1D marginals (#600, thanks to @guymoss)
- fix GPU issues for `conditional_pairplot` and `ActiveSubspace` (#613)
- MCMC can be performed in unconstrained space also when using a `MultipleIndependent` distribution as prior (#619)
- added z-scoring option for structured data (#597, thanks to @rdgao)
- refactor c2st; change its default classifier to random forest (#503, thanks to @psteinb)
- MCMC `init_strategy` is now called `proposal` instead of `prior` (#602)
- inference objects can be serialized with `pickle` (#617); see the sketch below
- preconfigured fully connected embedding net (#644, thanks to @JuliaLinhart, #624)
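Serializing a trained inference object now works with the standard library; a minimal sketch, assuming `inference` is a trained inference object:

```python
import pickle

# Persist the trained inference object ...
with open("inference.pkl", "wb") as handle:
    pickle.dump(inference, handle)

# ... and restore it later, e.g. to continue training or build a posterior.
with open("inference.pkl", "rb") as handle:
    inference = pickle.load(handle)
```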
- posterior ensembles (#612, thanks to @jnsbck)
- remove gradients before returning the `posterior` (#631, thanks to @tomMoral)
- reduce the batch size of rejection sampling if few samples are left (#631, thanks to @tomMoral)
- tutorial for how to use SBC (#629, thanks to @psteinb)
- tutorial for how to use SBI with trial-based data and mixed data types (#638)
- allow using a `RestrictedPrior` as the prior for `SNPE` (#642)
- optional pre-configured embedding nets (#568, #644, thanks to @JuliaLinhart)
- bug fix for transforms in KDE (#552)
- improve kwarg handling for rejection ABC and SMC-ABC
- typo and link fixes (#549, thanks to @pitmonticone)
- tutorial notebook on crafting summary statistics with sbi (#511, thanks to @ybernaerts)
- small fixes and improved documentation for device handling (#544, thanks to @milagorecki)
- New API for specifying sampling methods (#487). Old syntax:

```python
posterior = inference.build_posterior(sample_with_mcmc=True)
```

New syntax:

```python
posterior = inference.build_posterior(sample_with="mcmc")  # or "rejection"
```
- Rejection sampling for likelihood(-ratio)-based posteriors (#487)
- MCMC in unconstrained and z-scored space (#510)
- Prior is now allowed to lie on GPU. The prior has to be on the same device as the one passed for training (#519).
- Rejection-ABC and SMC-ABC now return the accepted particles / parameters by default, or a KDE fit on those particles (`kde=True`) (#525).
- Fast analytical sampling, evaluation and conditioning for `DirectPosterior` trained with MDNs (thanks @jnsbck, #458).
- `scatter` allowed for diagonal entries in pairplot (#510)
- Changes to default hyperparameters for `SNPE_A` (thanks @famura, #496, #497)
- bugfix for `within_prior` checks (#506)
- Implementation of SNPE-A (thanks @famura and @theogruner, #474, #478, #480, #482)
- Option to do inference over iid observations with SNLE and SNRE (#484, #488)
- Fixed unused argument `num_bins` when using `nsf` as density estimator (#465)
- Fixes to adapt to the new support handling in `torch` `v1.8.0` (#469)
- More scalars for monitoring training progress (thanks @psteinb, #471)
- Fixed bug in `minimal.py` (thanks @psteinb, #485)
- Depend on `pyknos` `v0.14.2`
- add option to pass `torch.utils.data.DataLoader` kwargs to all inference methods (thanks @narendramukherjee, #445); see the sketch below
- fix bug due to release of `torch` `v1.8.0` (#451)
- expose `leakage_correction` parameters for `log_prob` correction in unnormalized posteriors (thanks @famura, #454)
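A sketch of the intended usage of the `DataLoader` kwargs mentioned above; the keyword name `dataloader_kwargs` is an assumption here and should be checked against your installed version:

```python
# Assumes an `inference` object with simulations already appended.
# Extra options are forwarded to the torch DataLoader used during training
# (the `dataloader_kwargs` name is assumed, not confirmed by this changelog).
density_estimator = inference.train(
    dataloader_kwargs=dict(num_workers=4, pin_memory=True)
)
```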
- Active subspaces for sensitivity analysis (#394, tutorial)
- Method to compute the maximum-a-posteriori estimate from the posterior (#412)
- `pairplot()`, `conditional_pairplot()`, and `conditional_corrcoeff()` should now be imported from `sbi.analysis` instead of `sbi.utils` (#394); see the sketch below
- Changed `fig_size` to `figsize` in pairplot (#394)
- moved `user_input_checks` to `sbi.utils` (#430)
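The new import locations, as a quick reference (sketch):

```python
# Plotting and conditional-analysis helpers now live in sbi.analysis.
from sbi.analysis import pairplot, conditional_pairplot, conditional_corrcoeff
```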
- Depend on new `joblib=1.0.0` and fix progress bar updates for multiprocessing (#421).
- Fix for embedding nets with `SNRE` (thanks @adittmann, #425).
- It is now optional to pass a prior distribution when using SNPE (#426).
- Support loading of posteriors saved after `sbi v0.15.0` (#427, thanks @psteinb).
- Neural network training can be resumed (#431).
- Allow using NSF to estimate 1D distributions (#438).
- Fix type checks in input checks (thanks @psteinb, #439).
- Bugfix for GPU training with SNRE_A (thanks @glouppe, #442).
- Fixup for conditional correlation matrix (thanks @JBeckUniTb, #404)
- z-score data using only the training data (#411)
- Small fix for SMC-ABC with semi-automatic summary statistics (#402)
- Support for training and sampling on GPU including fixes from `nflows` (#331)
- Bug fix for SNPE with neural spline flow and MCMC (#398)
- Small fix for SMC-ABC particles covariance
- Small fix for rejection-classifier (#396)
- New flexible interface API (#378). This is a breaking change for users of the flexible interface, and you will have to change your code. Old syntax:

```python
from sbi.inference import SNPE, prepare_for_sbi

simulator, prior = prepare_for_sbi(simulator, prior)
inference = SNPE(simulator, prior)

# Simulate, train, and build posterior.
posterior = inference(num_simulation=1000)
```

New syntax:

```python
from sbi.inference import SNPE, prepare_for_sbi, simulate_for_sbi

simulator, prior = prepare_for_sbi(simulator, prior)
inference = SNPE(prior)

theta, x = simulate_for_sbi(simulator, proposal=prior, num_simulations=1000)
density_estimator = inference.append_simulations(theta, x).train()
posterior = inference.build_posterior(density_estimator)  # MCMC kwargs go here.
```

More information can be found in the sbi documentation.
- Fixed typo in docs for `infer` (thanks @glouppe, #370)
- New `RestrictionEstimator` to learn regions of bad simulation outputs (#390)
- Improvements to existing ABC methods, and new ABC methods (#395):
  - Linear regression adjustment as in Beaumont et al. 2002 for both MCABC and SMCABC
  - Semi-automatic summary statistics as in Fearnhead & Prangle 2012 for both MCABC and SMCABC
  - Small fixes to perturbation kernel covariance estimation in SMCABC.
- Fix bug in SNRE (#363)
- Fix warnings for multi-D x (#361)
- Small improvements to MCMC, verbosity and continuing of chains (#347, #348)
- Make logging of vectorized numpy slice sampler slightly less verbose and address NumPy future warning (#347)
- Allow continuation of MCMC chains (#348)
- Conditional distributions and correlations for analysing the posterior (#321)
- Moved rarely used arguments from pairplot into kwargs (#321)
- Sampling from conditional posterior (#327)
- Allow inference with multi-dimensional x when appropriate embedding is passed (#335)
- Fixes a bug with `clamp_and_warn` not overriding `num_atoms` for SNRE, and fixes the warning message itself (#338)
- Compatibility with Pyro 1.4.0 (#339)
- Speed up posterior rejection sampling by introducing batch size (#340, #343)
- Allow vectorized evaluation of numpy potentials (#341)
- Adds vectorized version of numpy slice sampler which allows parallel log prob evaluations across all chains (#344)
- Bug fix for zero simulations in later rounds (#318)
- Bug fix for `sbi.utils.sbiutils.Standardize`; mean and std are now registered in the state dict (thanks @plcrodrigues, #325)
- Tutorials on `embedding_net` and presimulated data (thanks @plcrodrigues, #314, #318)
- FAQ entry for pickling error
- Bug fix for broken NSF (#310, thanks @tvwenger).
- Add FAQ (#293)
- Fix bug in `embedding_net` when the output dimension does not equal the input dimension (#299)
- Expose arguments of functions used to build custom networks (#299)
- Implement non-atomic APT (#301)
- Depend on pyknos 0.12 and nflows 0.12
- Improve documentation (#302, #305, thanks to @agramfort)
- Fix bug for 1D uniform priors (#307).
- Fixed pickling of SNRE by moving StandardizeInputs (#291)
- Added check to ensure correct round number when presimulated data is provided
- Subclassed Posterior depending on inference algorithm (#282, #285)
- Pinned pyro to v1.3.1 as a temporary workaround (see #288)
- Detaching weights for MCMC SIR init immediately to save memory (#292)
- Bug fix for log_prob() in SNRE (#280)
- Changed the API to do multi-round inference (#273)
- Allow to continue inference (#273)
- Added missing type imports (#275)
- Made compatible with Python 3.6 (#275)
- Added `mcmc_parameters` to init methods of inference methods (#270)
- Fixed detaching of `log_weights` when using `sir` MCMC init (#270)
- Fixed logging for SMC-ABC
- Added option to pass external data (#264)
- Added setters for MCMC parameters (#267)
- Added check for `density_estimator` argument (#263)
- Fixed `NeuralPosterior` pickling error (#265)
- Added code coverage reporting (#269)
- Added ABC methods (#250)
- Added multiple chains for MCMC and new init strategy (#247)
- Added options for z-scoring for all inference methods (#256)
- Simplified swapping out neural networks (#256)
- Improved tutorials
- Fixed device keyword argument (#253)
- Removed need for passing x-shapes (#259)
- First public version