How to constrain all the constants to be positive #449
-
Hi Miles, I am a huge fan of PySR. Currently, I am using PySR to find mathematical models for human tissues, and it works very well. Small question: is it possible to constrain all the predicted constants to be positive? For example, in f(x) = a*exp(b*x) + c*ln(1 - d*x), a, b, c, d should be strictly positive. Thanks
-
You could do this with a custom objective function, like:

```julia
function eval_loss(tree, dataset::Dataset{T,L}, options)::L where {T,L}
    # See https://astroautomata.com/SymbolicRegression.jl/dev/types/#DynamicExpressions.EquationModule.Node
    is_negative_constant(node) = node.degree == 0 && node.constant && node.val::T < 0
    # (The ::T part is not required, but it speeds things up, since Julia then knows the value isn't `nothing`.)

    # Walk through the tree and count how many times this predicate is true:
    num_negative_constants = count(is_negative_constant, tree)
    # (Tree utilities are defined in https://github.com/SymbolicML/DynamicExpressions.jl/blob/master/src/base.jl,
    # and let you treat an expression like a Julia collection.)

    if num_negative_constants > 0
        # Return 1000 times the number of negative constants as a regularization penalty:
        return L(1000 * num_negative_constants)
    end

    prediction, flag = eval_tree_array(tree, dataset.X, options)
    if !flag
        return L(Inf)
    end

    return sum((prediction .- dataset.y) .^ 2) / dataset.n
end
```

It's better to have a scaling penalty like this, so the search has a sense of direction: many negative constants are penalized more heavily than one negative constant.
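For reference, here is a sketch of how such a Julia objective could be wired into PySR from the Python side. This is a hedged assumption, not from the thread itself: recent PySR versions accept a full Julia objective as a string via the `loss_function` keyword of `PySRRegressor` (older versions called it `full_objective`), so check the docs for your installed version.

```python
# The custom objective from the reply, stored as a Julia source string.
julia_objective = """
function eval_loss(tree, dataset::Dataset{T,L}, options)::L where {T,L}
    is_negative_constant(node) = node.degree == 0 && node.constant && node.val::T < 0
    num_negative_constants = count(is_negative_constant, tree)
    if num_negative_constants > 0
        return L(1000 * num_negative_constants)
    end
    prediction, flag = eval_tree_array(tree, dataset.X, options)
    if !flag
        return L(Inf)
    end
    return sum((prediction .- dataset.y) .^ 2) / dataset.n
end
"""

# Usage sketch (requires `pip install pysr`; an actual fit is slow, so it is
# left commented out here):
# from pysr import PySRRegressor
# model = PySRRegressor(loss_function=julia_objective)
# model.fit(X, y)
```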
You made my day :) Cheers,