CONOPT.jl is a Julia wrapper for the CONOPT solver.
It has two components:

- a thin wrapper around the C API
- an interface to MathOptInterface
This wrapper is maintained by GAMS Software GmbH.
CONOPT.jl is licensed under the MIT License.
The underlying solver, CONOPT, is proprietary software from GAMS. Various licensing options are available, including Demo, Evaluation, Academic, and Full licenses.
The Academic license gives you full access to CONOPT, provided that you:

- are affiliated with a recognized academic institution, and
- intend to use CONOPT for non-commercial teaching and research.
There are a number of ways to provide a license to CONOPT.jl. They are loaded with the following precedence (from highest to lowest):
You can provide the license details as raw optimizer attributes when creating a JuMP model:
```julia
using JuMP, CONOPT
model = Model(CONOPT.Optimizer)
set_attribute(model, "license_int_1", license_int_1)
set_attribute(model, "license_int_2", license_int_2)
set_attribute(model, "license_int_3", license_int_3)
set_attribute(model, "license_string", "your-license-string")
```

Alternatively, when using the low-level C API, you can pass the license information directly to the CONOPT.ConoptModel constructor.
The recommended way to provide a license is to save it to your local environment for the current project.
Use the CONOPT.set_license function with the integers and string from your GAMS license file:
```julia
import CONOPT
CONOPT.set_license(license_int_1, license_int_2, license_int_3, "your-license-string")
```

This saves the license details to your LocalPreferences.toml file, so you only need to do this once per project.
You can also provide the license via environment variables. This is useful for CI or other automated environments.
```shell
export CONOPT_LICENSE_INT_1=<license_int_1>
export CONOPT_LICENSE_INT_2=<license_int_2>
export CONOPT_LICENSE_INT_3=<license_int_3>
export CONOPT_LICENSE_STRING="<your-license-string>"
```

Contact GAMS support if you encounter any problems using this interface or the solver.
If you have a reproducible example of a bug, please open a GitHub issue.
To use CONOPT.jl, you must have a local installation of the CONOPT solver libraries. Please see the CONOPT website for information on obtaining CONOPT.
CONOPT.jl needs to know the location of the CONOPT shared library (e.g., libconopt.so, conopt.dll, or conopt.dylib).
Tell CONOPT.jl where to find the library by calling CONOPT.set_library_path:
```julia
import CONOPT
# This is an example; use the actual path to your CONOPT library
CONOPT.set_library_path("/path/to/your/conopt/library/libconopt.so")
```

This preference is saved to a LocalPreferences.toml file in your current project. You will need to restart your Julia session for the change to take effect.
Once the library path is set, you can install CONOPT.jl using the Julia package manager:
```julia
import Pkg
Pkg.add("CONOPT")
```

You can use CONOPT with JuMP as follows:
```julia
using JuMP, CONOPT
model = Model(CONOPT.Optimizer)
set_attribute(model, "lim_iteration", 100)
set_attribute(model, "log_level", 0)
```

CONOPT.jl moves the CONOPT.Optimizer object to a package extension. As a consequence, CONOPT.Optimizer is type unstable, and it will be inferred as CONOPT.Optimizer()::Any.
In most cases, this should not impact performance. If it does, there are two workarounds.
First, you can use a function barrier:
```julia
using JuMP, CONOPT

function main(optimizer::T) where {T}
    model = Model(optimizer)
    return
end

main(CONOPT.Optimizer)
```

Although the outer CONOPT.Optimizer is type unstable, the optimizer inside main will be properly inferred.
Second, you may explicitly get and use the extension module:
```julia
using JuMP, CONOPT
const ConoptMathOptInterfaceExt =
    Base.get_extension(CONOPT, :ConoptMathOptInterfaceExt)
model = Model(ConoptMathOptInterfaceExt.Optimizer)
```

The CONOPT optimizer supports the following constraints and attributes.
List of supported objective functions:

- `MOI.ObjectiveFunction{MOI.ScalarAffineFunction{Float64}}`
- `MOI.ObjectiveFunction{MOI.ScalarNonlinearFunction}`
- `MOI.ObjectiveFunction{MOI.ScalarQuadraticFunction{Float64}}`
List of supported variable types:
List of supported constraint types:
- `MOI.ScalarAffineFunction{Float64}` in `MOI.EqualTo{Float64}`
- `MOI.ScalarAffineFunction{Float64}` in `MOI.GreaterThan{Float64}`
- `MOI.ScalarAffineFunction{Float64}` in `MOI.LessThan{Float64}`
- `MOI.ScalarNonlinearFunction` in `MOI.EqualTo{Float64}`
- `MOI.ScalarNonlinearFunction` in `MOI.GreaterThan{Float64}`
- `MOI.ScalarNonlinearFunction` in `MOI.LessThan{Float64}`
- `MOI.ScalarQuadraticFunction{Float64}` in `MOI.EqualTo{Float64}`
- `MOI.ScalarQuadraticFunction{Float64}` in `MOI.GreaterThan{Float64}`
- `MOI.ScalarQuadraticFunction{Float64}` in `MOI.LessThan{Float64}`
- `MOI.VariableIndex` in `MOI.EqualTo{Float64}`
- `MOI.VariableIndex` in `MOI.GreaterThan{Float64}`
- `MOI.VariableIndex` in `MOI.Interval{Float64}`
- `MOI.VariableIndex` in `MOI.LessThan{Float64}`
- `MOI.VectorOfVariables` in `MOI.HyperRectangle{Float64}`
List of supported model attributes:
- `MOI.Name`
- `MOI.Silent`
- `MOI.TimeLimitSec`
- `MOI.NumberOfThreads`
- `MOI.ObjectiveSense`
- `MOI.SolveTimeSec`
- `MOI.BarrierIterations` (note: CONOPT executes a GRG algorithm, instead of a barrier algorithm; the iterations reported by this attribute are for the GRG algorithm)
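To make the lists above concrete, here is a hedged sketch of a small model that exercises several of the supported function-in-set pairs. It assumes a working CONOPT installation and license; the variable names and constraint data are purely illustrative:

```julia
using JuMP, CONOPT

model = Model(CONOPT.Optimizer)
@variable(model, 0 <= x <= 5)               # MOI.VariableIndex in MOI.Interval
@variable(model, y >= 1)                    # MOI.VariableIndex in MOI.GreaterThan
@constraint(model, 2x + 3y <= 12)           # ScalarAffineFunction in LessThan
@constraint(model, x^2 + y^2 <= 20)         # ScalarQuadraticFunction in LessThan
@constraint(model, x * exp(y) >= 2)         # ScalarNonlinearFunction in GreaterThan
@objective(model, Min, (x - 1)^2 + log(y))  # ScalarNonlinearFunction objective
optimize!(model)
```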
A list of available options is provided in the CONOPT reference manual.
Set options using MOI.RawOptimizerAttribute:

```julia
set_attribute(model, "lim_iteration", 100)
```

CONOPT.jl provides a low-level wrapper around the CONOPT C API, which is used by the MathOptInterface implementation.
The main entry point for the low-level API is the CONOPT.ConoptModel object. Using this object requires the user to manually manage memory and callbacks.
For a detailed example of how to use the C API, see the implementation of the MathOptInterface wrapper in ext/ConoptMathOptInterfaceExt/MOI_wrapper.jl.