Symbolic Transformer #789
-
Hi, newbie question that I can't find a reference to in the documentation. Is it possible to generate new features instead of making predictions, similar to GPlearn's Symbolic Transformer?
-
You could write down an arbitrary forward function (https://ai.damtp.cam.ac.uk/pysr/examples/#11-expression-specifications) or loss function (https://ai.damtp.cam.ac.uk/pysr/examples/#9-custom-objectives) and set up a problem where you optimize an unsupervised objective (you could write down a fake value for `y`, since the objective never has to use it).
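As a minimal sketch of that unsupervised setup (the variance-based objective and the names `my_unsupervised_loss` / `y_fake` here are placeholders I made up for illustration, not anything built into PySR): the custom loss evaluates the candidate expression on `dataset.X` and never touches `dataset.y`, so the `y` passed to `.fit()` exists only to satisfy the API.

```python
import numpy as np
from pysr import PySRRegressor

X = np.random.randn(100, 3)  # toy data for the sketch

# Hypothetical unsupervised objective: prefer expressions whose output has
# large variance. dataset.y is never used inside the loss.
unsupervised_loss = """
function my_unsupervised_loss(tree, dataset::Dataset{T,L}, options) where {T,L}
    prediction, complete = eval_tree_array(tree, dataset.X, options)
    if !complete
        return L(Inf)
    end
    # Sample variance of the expression's output (Base functions only)
    m = sum(prediction) / dataset.n
    v = sum(abs2, prediction .- m) / dataset.n
    # Lower loss for higher-variance outputs
    return L(1 / (v + eps(T)))
end
"""

model = PySRRegressor(loss_function=unsupervised_loss)
y_fake = np.zeros(X.shape[0])  # dummy target, ignored by the objective
model.fit(X, y_fake)
```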
-
Here's a Spearman correlation loss for PySR, in case this is what you were after:

```python
loss_function = """
function my_spearman_loss(tree, dataset::Dataset{T,L}, options) where {T,L}
    prediction, complete = eval_tree_array(tree, dataset.X, options)
    if !complete
        return L(Inf)
    end
    # Get ranks
    rank_pred = sortperm(sortperm(prediction))
    rank_y = sortperm(sortperm(dataset.y))
    # Calculate Spearman's rho (formula assuming no ties)
    d = rank_pred .- rank_y
    rho = 1 - (6 * sum(d .^ 2)) / (dataset.n * (dataset.n^2 - 1))
    # Return 1 - rho so it's a loss to minimize (rho ranges from -1 to 1)
    return L(1 - rho)
end
"""

model = PySRRegressor(loss_function=loss_function)
model.fit(X, y)

features = [model.predict(X, index=i) for i in range(len(model.equations_))]
```
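If you want those outputs as a feature matrix for a downstream model (roughly what GPlearn's Symbolic Transformer gives you), you can stack the per-equation predictions into columns; a small sketch, assuming numpy is available:

```python
import numpy as np

# Evaluate every equation on the Pareto front and stack the outputs as columns,
# giving one engineered feature per discovered expression.
X_new = np.column_stack(
    [model.predict(X, index=i) for i in range(len(model.equations_))]
)
```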