
Parametrize JuMP model in optimizer type #1348

Closed
wants to merge 8 commits into from

Conversation

@blegat (Member) commented Jun 17, 2018

The optimizer type parameter allows zero overhead on the JuMP side. The variable type therefore now needs to be parametrized on the model type so that its fields are concretely typed, which accounts for many of the changes in this PR.
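Schematically, the change amounts to the following sketch (hypothetical TypedModel/TypedVariableRef names, not the PR's actual code):

# Parametrizing the model on its backend type makes every field along
# the chain concretely typed, so calls into the backend need no
# dynamic dispatch.
mutable struct TypedModel{OT}
    moi_backend::OT              # concrete once OT is fixed
end

struct TypedVariableRef{M<:TypedModel}
    model::M                     # e.g. TypedModel{SomeOptimizer}, fully concrete
    index::Int
end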

Benchmarking results

This uses the benchmarking file in test/perf/backend_overhead.jl.
Before this change:

v0.6
14.526 ns (0 allocations: 0 bytes)
43.865 ns (5 allocations: 96 bytes)
v0.7
19.497 ns (0 allocations: 0 bytes)
35.113 ns (3 allocations: 64 bytes)

After this change:

v0.6
1.832 ns (0 allocations: 0 bytes)
24.567 ns (3 allocations: 64 bytes)
v0.7
1.824 ns (0 allocations: 0 bytes)
11.284 ns (1 allocation: 32 bytes)
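
The benchmark file itself is not reproduced in this thread; a minimal stand-in for the kind of measurement it performs (not the actual contents of backend_overhead.jl) is:

julia> using JuMP, BenchmarkTools

julia> const model = Model()

julia> @btime @variable(model)   # per-variable overhead, as reported above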

Related to jump-dev/MathOptInterface.jl#321

@mlubin (Member) left a comment


I will need convincing benchmarks before approving this. The usability loss from extra complexity in the printouts and debugging statements is pretty terrible.

test/print.jl Outdated
@@ -116,10 +116,10 @@ end

v = [x,y,x]
A = [x y; y x]
-io_test(REPLMode, v, "JuMP.VariableRef[x, y, x]")
+io_test(REPLMode, v, "JuMP.VariableRef{JuMP.Model{MathOptInterface.Utilities.CachingOptimizer{Union{MathOptInterface.AbstractOptimizer, Void},MathOptInterface.Utilities.UniversalFallback{JuMP.JuMPMOIModel{Float64}}}}}[x, y, x]")
Member:

JuMP.VariableRef{JuMP.Model{MathOptInterface.Utilities.CachingOptimizer{Union{MathOptInterface.AbstractOptimizer, Void},MathOptInterface.Utilities.UniversalFallback{JuMP.JuMPMOIModel{Float64}}}}} is not ok for user-facing printouts.

@blegat (Member Author):

This is fixed

@codecov (bot) commented Jun 17, 2018

Codecov Report

Merging #1348 into master will increase coverage by <.01%.
The diff coverage is 100%.


@@            Coverage Diff             @@
##           master    #1348      +/-   ##
==========================================
+ Coverage   89.33%   89.34%   +<.01%     
==========================================
  Files          24       24              
  Lines        3386     3368      -18     
==========================================
- Hits         3025     3009      -16     
+ Misses        361      359       -2
Impacted Files Coverage Δ
src/nlp.jl 79.56% <ø> (ø) ⬆️
src/print.jl 79.45% <ø> (ø) ⬆️
src/parseexpr.jl 94.29% <100%> (ø) ⬆️
src/JuMP.jl 74.77% <100%> (-3.36%) ⬇️
src/affexpr.jl 97.45% <100%> (ø) ⬆️
src/quadexpr.jl 92.4% <100%> (ø) ⬆️
src/macros.jl 88.48% <100%> (ø) ⬆️
src/variables.jl 88.55% <100%> (ø) ⬆️
src/operators.jl 96.69% <0%> (-0.02%) ⬇️
... and 1 more

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 1d94c4b...1d0e847.

src/print.jl Outdated
@@ -130,7 +130,17 @@ function var_str(::Type{IJuliaMode}, v::AbstractVariableRef; mathmode=true)
return math("noname", mathmode)
end
end

# We want arrays of variables to be printed with `JuMP.VariableRef` as eltype
Member:

Eh, this is another chunk of code that will break between julia versions. Weird printing will also confuse developers when they try to create arrays to store VariableRefs.

@blegat (Member Author):

Yes, but it improves printing :-P Ok I'll revert this ^^
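
For reference, the kind of overload under discussion, sketched with a hypothetical Ref2 type instead of the parametrized VariableRef (Base.showarg is the documented extension point behind summary):

struct Ref2{M}
    model::M
    index::Int
end

# Hide the long model type parameter when arrays of Ref2 are summarized.
Base.showarg(io::IO, ::Vector{<:Ref2}, toplevel::Bool) = print(io, "Vector{Ref2}")

summary([Ref2(nothing, 1)])   # "1-element Vector{Ref2}"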

@blegat (Member Author) commented Jun 18, 2018

I have added a benchmark; see the initial comment for the results.

@blegat (Member Author) commented Nov 13, 2018

The overhead of having an untyped backend now accounts for roughly a third of both the time and the allocated memory of @variable(model).
With Julia v1.0.2:

julia> using JuMP

julia> const model = Model()
A JuMP Model

julia> using BenchmarkTools

julia> @btime @variable(model)
  33.454 ns (2 allocations: 48 bytes)
noname

By annotating the type of the backend in https://github.com/JuliaOpt/JuMP.jl/blob/a0af39df82fce30cce0821f4f08ec7b13d710b01/src/variables.jl#L206
we get

julia> using JuMP

julia> const model = Model()
A JuMP Model

julia> using BenchmarkTools

julia> @btime @variable(model)
  18.558 ns (1 allocation: 32 bytes)
noname
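
The annotation in question is essentially a one-line function barrier: asserting the concrete type of an abstractly typed field lets the compiler resolve the inner call statically. A self-contained illustration with hypothetical types:

abstract type AbstractBackend end
struct Backend <: AbstractBackend end

add_var(::Backend) = 1

mutable struct UntypedModel
    backend::AbstractBackend                              # abstract field, as in JuMP here
end

add_slow(m::UntypedModel) = add_var(m.backend)            # dynamic dispatch per call
add_fast(m::UntypedModel) = add_var(m.backend::Backend)   # the annotation: static dispatch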

@mlubin (Member) commented Nov 13, 2018

I was looking into this a bit last night. I couldn't figure out why exactly Julia is allocating (I couldn't make a small example of an allocation when dispatching on an object whose type is unknown). Maybe there's a trick to fix it.

Another option without parameterizing the model is to do the work in batch and call add_variables instead of add_variable when we want to create 1M variables.
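
MOI exposes such a batch method: MOI.add_variables(model, n) returns a vector of n variable indices, so a single dynamically dispatched call covers all of them. A sketch (MOI.Utilities.Model is the generic backend in current MathOptInterface versions and stands in here for whatever backend the model holds):

using MathOptInterface
const MOI = MathOptInterface

backend = MOI.Utilities.Model{Float64}()
# One dispatched call adds every variable, amortizing the dispatch cost
# that a loop of MOI.add_variable calls would pay per variable.
indices = MOI.add_variables(backend, 1_000_000)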

@blegat (Member Author) commented Nov 13, 2018

The batch idea could help. As for the 32 remaining bytes, they are due to VariableRef not being isbits: it holds the model as a field, and the model is not isbits.
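
A tiny illustration of the isbits point, with hypothetical types:

mutable struct TinyModel     # mutable ⇒ not isbits
    n::Int
end

struct TinyRef
    model::TinyModel         # field is a reference to a heap object
    index::Int
end

isbitstype(TinyRef)          # false: in the Julia versions discussed here,
                             # returning a TinyRef from a function allocates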

@odow (Member) commented Jan 9, 2019

What is the status of this?

@blegat mentioned this pull request Feb 3, 2019
@odow (Member) commented Sep 27, 2021

Closing as stale. There are open PRs exploring alternatives: #2520 and #2610.

@odow closed this Sep 27, 2021
@odow deleted the bl/ot branch November 10, 2021 02:29