Dear all,

I came across the following problem: after training my neural density estimator plus score (SCANDAL) without any problems, I tried to evaluate the log-likelihood with the trained network on a separate test sample.
However, I always get NaNs inside the MAF code. By introspecting with a forward module hook, I could track it down to self.Ws of the ConditionalMaskedAutoregressiveFlow using pdb.
As far as I can tell, the values in self.Ws are produced normally - no NaNs.
However, after a call to self.to(), the NaNs appear...
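For reference, this is roughly the kind of forward hook I mean (a minimal sketch only; the function name and usage below are placeholders, not MadMiner internals):

```python
import torch
import torch.nn as nn

def register_nan_checks(model: nn.Module) -> None:
    """Attach a forward hook to every submodule that reports NaNs
    in its parameters and outputs as soon as a forward pass touches it."""

    def hook(module, inputs, output):
        # Check the parameters owned directly by this submodule
        for name, param in module.named_parameters(recurse=False):
            if torch.isnan(param).any():
                print(f"NaN in parameter {type(module).__name__}.{name}")
        # Check the forward output(s) as well
        outputs = output if isinstance(output, (tuple, list)) else (output,)
        for i, out in enumerate(outputs):
            if torch.is_tensor(out) and torch.isnan(out).any():
                print(f"NaN in output {i} of {type(module).__name__}")

    for module in model.modules():
        module.register_forward_hook(hook)
```

Calling something like `register_nan_checks(maf)` before evaluating the log-likelihood then prints every submodule whose weights or outputs contain NaNs during the forward pass.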
SCANDAL is not necessary for the associated physics project, but it would be nice to have for comparison.
Cheers,
Simon
Thank you for reporting this issue, and sorry for this bug!
We have somewhat neglected debugging the MAF code, and in a perfect world we'd rewrite it entirely, e.g. based on the nflows library (https://github.com/bayesiains/nflows). Unfortunately, I don't know if I'll have any time to spend on MadMiner in the near future. Maybe some other team member has time to look at this?
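For what it's worth, a conditional MAF in nflows is only a few lines; a rough sketch (the dimensions and layer counts below are arbitrary, and this is not drop-in compatible with MadMiner's current interface):

```python
import torch
from nflows.flows.base import Flow
from nflows.distributions.normal import StandardNormal
from nflows.transforms.base import CompositeTransform
from nflows.transforms.autoregressive import MaskedAffineAutoregressiveTransform
from nflows.transforms.permutations import ReversePermutation

n_observables, n_parameters = 2, 2  # arbitrary example dimensions

transforms = []
for _ in range(5):
    transforms.append(ReversePermutation(features=n_observables))
    transforms.append(
        MaskedAffineAutoregressiveTransform(
            features=n_observables,
            hidden_features=32,
            context_features=n_parameters,  # conditioning on theta
        )
    )

flow = Flow(CompositeTransform(transforms), StandardNormal(shape=[n_observables]))

# log p(x | theta) for a batch of events x and parameter points theta
x = torch.randn(16, n_observables)
theta = torch.randn(16, n_parameters)
log_prob = flow.log_prob(x, context=theta)
```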
Thanks, @johannbrehmer, for looking into it. In that case, I will just keep my hands off SCANDAL for now.
A little more information: it already seems to be an issue with the write-out after training (maybe a `from` call?). I do not know for sure, but I will continue investigating (a quick check of the saved weights is sketched below).
However, if you plan a re-implementation, then this issue is more or less a won't-fix.
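Concretely, something like this is what I have in mind for checking the saved weights (the path is just a placeholder for whatever file MadMiner writes after training):

```python
import torch

# Placeholder path - substitute the weight file written after training
state_dict = torch.load("models/scandal.pt", map_location="cpu")

# If the saved tensors already contain NaNs, the problem is in the
# write-out / training step, not in the later .to() device transfer
for key, tensor in state_dict.items():
    if torch.is_tensor(tensor) and torch.isnan(tensor).any():
        print(f"NaN already present in saved tensor: {key}")
```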