I am testing embedding_net; in the example embedding, latent_dim = 10, and there are hyperparameters such as num_hiddens, num_layers, and so on. I assume they should be tuned? Is there any guideline? For the project I am working on, I used a CNN model in which the dense layers' number of neurons decreases to create a hierarchical structure, and its performance is better than using SBI. But I am thinking SBI is designed for inference problems, so it should be better? Any comments and suggestions are appreciated.
Hi @timktsang,

yes, these hyperparameters should be adapted to the given problem. First, the choice of network is essential: for image data you should use a CNN embedding; for time series, an RNN or a transformer; for other high-dimensional data (say >100 dimensions), a fully connected FCEmbedding can suffice. The permutation-invariant embedding is useful for trial-based data.

I am not sure I understand how you used your CNN model as an alternative to SBI, but if it is working well, you could just use that CNN as the embedding net for the inference with SBI.

Regarding hyperparameter searches, you could set apart a test set of $N$ simulations (theta, x) and then calculate the negative log probability of the true parameters under the posterior:

nltp = -torch.mean(torch.tensor([posterior.log_prob(theta_i, x=x_i) for theta_i, x_i in zip(theta, x)]))

The "best fitting" hyperparameter setting would then be the one with the lowest nltp on that held-out test set.

I hope this helps. Let me know if there are further questions.

Best,
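As a rough sketch of the fully connected embedding option mentioned above: this is not sbi's actual FCEmbedding implementation — the class name, defaults, and architecture here are made up for illustration — but it shows how latent_dim, num_hiddens, and num_layers act as the tunable hyperparameters of such a net.

```python
import torch
import torch.nn as nn

class SimpleFCEmbedding(nn.Module):
    """Hypothetical stand-in for a fully connected embedding net.

    Maps high-dimensional simulation output x down to a latent summary
    vector; latent_dim, num_hiddens, and num_layers are the kind of
    hyperparameters discussed in the reply above.
    """

    def __init__(self, input_dim, latent_dim=10, num_hiddens=50, num_layers=2):
        super().__init__()
        layers = [nn.Linear(input_dim, num_hiddens), nn.ReLU()]
        for _ in range(num_layers - 1):
            layers += [nn.Linear(num_hiddens, num_hiddens), nn.ReLU()]
        layers.append(nn.Linear(num_hiddens, latent_dim))
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)

# Embed a batch of 32 simulated observations of dimension 200.
emb = SimpleFCEmbedding(input_dim=200, latent_dim=10)
z = emb(torch.randn(32, 200))
print(z.shape)  # torch.Size([32, 10])
```

In sbi itself, such a module would be passed as the embedding_net argument when building the density estimator, so it is trained jointly with the inference network.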
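To make the nltp-based hyperparameter comparison concrete, here is a runnable sketch. MockPosterior is a made-up stand-in for a trained sbi posterior (just a fixed Gaussian that ignores x), used only so the metric from the reply can be executed; with real sbi, each candidate hyperparameter setting would instead yield its own trained posterior.

```python
import torch
from torch.distributions import Independent, Normal

class MockPosterior:
    """Hypothetical stand-in for a trained sbi posterior (NOT sbi's API).

    Its 'scale' plays the role of a hyperparameter setting that changes
    how well the posterior concentrates on the true parameters.
    """

    def __init__(self, scale):
        self.dist = Independent(Normal(torch.zeros(3), scale * torch.ones(3)), 1)

    def log_prob(self, theta, x=None):  # x is ignored by this mock
        return self.dist.log_prob(theta)

def nltp(posterior, theta, x):
    # Negative log probability of the true parameters on a held-out
    # test set of (theta, x) pairs -- the metric from the reply above.
    return -torch.mean(torch.tensor(
        [posterior.log_prob(theta_i, x=x_i) for theta_i, x_i in zip(theta, x)]
    ))

# Held-out test set: "true" parameters and matching "observations".
theta = torch.zeros(100, 3)
x = torch.zeros(100, 5)

# Compare two candidate settings: lower nltp means a better fit.
scores = {scale: nltp(MockPosterior(scale), theta, x).item()
          for scale in (1.0, 5.0)}
best = min(scores, key=scores.get)
print(best)  # 1.0 -- the tighter posterior wins
```

The same loop structure applies with real posteriors: train one per hyperparameter configuration, evaluate nltp on the shared test set, and keep the configuration with the lowest value.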