
Type-stability issues #1824

Closed
odow opened this issue Feb 3, 2019 · 2 comments
odow (Member) commented Feb 3, 2019

I have some type-stability issues that I think are the cause of some performance problems I'm hitting.

Is it reasonable to expect that calls such as `value` and `dual` will return `Float64`s?

```julia
model = JuMP.Model(with_optimizer(Gurobi.Optimizer, OutputFlag=0))
@variable(model, x)
JuMP.optimize!(model)

@code_warntype JuMP.value(x)
```

```
Body::Any
705 1 ── %1 = MathOptInterface.get::Core.Compiler.Const(MathOptInterface.get, false)
    │    %2 = (Base.getfield)(v, :model)::Model                   │╻ owner_model
    │    %3 = %new(MathOptInterface.VariablePrimal, 1)::MathOptInterface.VariablePrimal
    │    %4 = invoke %1(%2::Model, %3::MathOptInterface.VariablePrimal, _2::VariableRef)::Any
    └──      return %4
```
blegat (Member) commented Feb 3, 2019

Is it due to the fact that the `moi_backend` type is not concrete (e.g., what is addressed by #1348)?
JuMP only supports `Float64` arithmetic at the moment, so adding a type assertion seems reasonable.
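The effect of such a type assertion can be illustrated with a minimal, self-contained sketch. The types below are hypothetical stand-ins for JuMP's `Model` and its `moi_backend` field, not the actual JuMP code:

```julia
# Toy reproduction of the inference problem: a non-concrete field type
# makes the accessor infer as Any, and a ::Float64 assertion restores a
# concrete return type. `ToyModel`/`ToyBackend` are illustrative only.
abstract type AbstractBackend end

struct ToyBackend <: AbstractBackend
    primal::Float64
end

struct ToyModel
    backend::AbstractBackend  # non-concrete field type, as in #1348
end

# Infers as Any: the compiler only knows `backend isa AbstractBackend`.
value_unstable(m::ToyModel) = m.backend.primal

# The assertion narrows the inferred return type to Float64.
value_stable(m::ToyModel) = m.backend.primal::Float64
```

Running `@code_warntype` on `value_unstable` shows `Body::Any`, while `value_stable` infers `Float64`; the assertion throws a `TypeError` at runtime if a backend ever returns something else, which matches JuMP's `Float64`-only assumption.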

odow (Member, Author) commented Feb 3, 2019

Yeah, it's probably due to the non-concreteness. Here is the function that fails to infer properly when `is_set_by_optimize` is true:

https://github.com/JuliaOpt/JuMP.jl/blob/1d7c5030ce50300f564b1cde1e1ecfabb82c5494/src/JuMP.jl#L573-L581
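Besides a type assertion, the other standard remedy for a non-concrete field is a function barrier: pay one dynamic dispatch at the field access, after which the inner method is compiled for the concrete backend type. A hedged sketch with illustrative names (not JuMP's actual internals):

```julia
# Hypothetical function-barrier sketch. `Holder`/`DenseBackend` are
# stand-ins for a model holding a non-concretely typed backend.
abstract type AbstractBackend end

struct DenseBackend <: AbstractBackend
    primals::Vector{Float64}
end

struct Holder
    backend::AbstractBackend  # non-concrete field
end

# Outer method: one dynamic dispatch on the runtime type of `h.backend`.
get_primal(h::Holder, i) = _get_primal(h.backend, i)

# Inner method: compiled for the concrete DenseBackend, so
# `b.primals[i]` infers as Float64 with no further dynamic lookups.
_get_primal(b::DenseBackend, i) = b.primals[i]
```

Unlike the assertion, this keeps tight inner loops type-stable even if different backends return different element types, at the cost of one dispatch per outer call.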
