Loss function result differs from training Loss #30

Open
jklanger opened this issue Mar 29, 2017 · 0 comments
Comments

jklanger (Collaborator) commented Mar 29, 2017

Loss evaluated with a separate function:
let lossFn = mi.Func<single> (loss) |> arg2 input target

differs from the loss evaluated during training:
let trainFn () = Train.train trainable fullDataset trainCfg

When this option is set:
SymTensor.Debug.DisableCombineIntoElementsOptimization <- true
the results are consistent again.
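
For reference, a minimal sketch of how the two values can be compared (only the lines quoted above come from my setup; identifiers such as inputVals and targetVals are placeholders for concrete tensors, and how the training loss is read out depends on the trainer configuration):

// disable the CombineIntoElements optimization *before* compiling any functions
SymTensor.Debug.DisableCombineIntoElementsOptimization <- true

// loss evaluated through a separately compiled function
let lossFn = mi.Func<single> (loss) |> arg2 input target
let separateLoss = lossFn inputVals targetVals   // inputVals / targetVals: placeholder value tensors

// loss reported during training on the same data
let trainFn () = Train.train trainable fullDataset trainCfg
let trainResult = trainFn ()
// with the optimization disabled, the loss logged by Train.train matches separateLoss;
// with the default settings the two values differ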

Tested via DeepPrivate:
Z:\DeepPrivate\****Models\bin\Release\****Models.exe Z:\DEVELOP\DeepPrivate\****Models\cfgs\COPYING\LSTM\80reluOutputTest\Config.fsx

Does the code get optimized differently during training?

Tested with [band 2277e5e]

jklanger changed the title from “Separate Loss function result differs from training Loss result” to “Loss function result differs from training Loss” on Mar 29, 2017