How to speed up custom loss function? #480
Replies: 1 comment 1 reply
-
I would try to benchmark it in pure Julia with BenchmarkTools.jl (https://github.com/JuliaCI/BenchmarkTools.jl), calling SymbolicRegression.jl directly. You can do this with, for example:

```julia
using SymbolicRegression
using BenchmarkTools
# For generating random trees:
using SymbolicRegression.MutationFunctionsModule: gen_random_tree_fixed_size
#= define objective function =#
# Set up dataset and options
options = Options(binary_operators=[+, *, -, /], unary_operators=[cos, exp])
nrows = 23
nfeatures = 5
T = Float64
dataset = Dataset(
    randn(T, nfeatures, nrows),
    randn(T, nrows)
)
# Benchmark:
@benchmark(
    my_custom_objective(tree, $dataset, $options),
    setup=(
        tree_size = 30;
        tree = gen_random_tree_fixed_size(tree_size, options, nfeatures, T)
    )
)
```

This will give you a sense of how fast it is. For this dataset size and tree size, the baseline median performance for the default objective is around 1.5 microseconds on my machine:

```julia
julia> function default_objective(tree, dataset::Dataset{T,L}, options) where {T,L}
           pred, completed = eval_tree_array(tree, dataset.X, options)
           !completed && return L(Inf)
           return sum(i -> (pred[i] - dataset.y[i])^2, eachindex(pred))
       end
default_objective (generic function with 1 method)
julia> # Benchmark:
@benchmark(
    default_objective(tree, $dataset, $options),
    setup=(
        tree_size = 30;
        tree = gen_random_tree_fixed_size(tree_size, options, nfeatures, T)
    )
)
BenchmarkTools.Trial: 10000 samples with 262 evaluations.
Range (min … max): 267.813 ns … 18.438 μs ┊ GC (min … max): 0.00% … 90.47%
Time (median): 1.507 μs ┊ GC (median): 0.00%
Time (mean ± σ): 1.632 μs ± 1.382 μs ┊ GC (mean ± σ): 7.27% ± 7.91%
▁▂▄▅▅▆█▇▆▇▆▆▄▃▂▁
▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▂▂▂▃▄▅▆▇█████████████████▇▆▅▄▄▃▃▂▂▂▂▂▁▁▁ ▄
268 ns Histogram: frequency by time 2.31 μs <
 Memory estimate: 960 bytes, allocs estimate: 4.
```

So if you are way larger than this (which is what it sounds like), there is likely some bottleneck in the code. You can continue to use this benchmark setup as you track it down.

My second comment is that you should check out https://docs.julialang.org/en/v1/manual/performance-tips/ for common performance "gotchas" in Julia. In particular, it seems like some of your code does not declare global variables as `const`; for example, instead of a bare `@variables x y z` at the top level, you can write `const (x, y, z) = @variables x y z`.

The Julia discourse is also super helpful for optimizing code snippets: https://discourse.julialang.org/.

Cheers,
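To make the `const` point concrete, here is a minimal sketch of that change. It assumes the variables in the linked gist are created with Symbolics.jl's `@variables` macro, which may not match the actual code; adjust it to whatever really defines them.

```julia
using Symbolics  # assumed source of @variables; adjust if the gist uses something else

# Non-const globals are a common slowdown: any function that reads them
# cannot be type-inferred, so every access goes through dynamic dispatch.
# That is what happens with a bare top-level declaration like
#
#     @variables x y z
#
# Declaring the bindings const instead fixes their types for the compiler:
const (x, y, z) = @variables x y z

# You can check whether your loss function suffers from this by running
# @code_warntype on it; values read from non-const globals show up as `Any`.
```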
-
I have a custom loss function (https://gist.github.com/DenisSvirin/64952e83496968cfce9d117328670c04).
(edit - pasted below)
I was following the tips on your website, but I still had only 4 iterations in 22 hours. Is there a way to speed it up?
Here is my model:
I have 23 training points and each cycle shouldn't be over 200 iterations.