erf() not defined when solving NL optimization problem with JuMP - julia

I'm solving a non-linear optimization problem with JuMP in Julia, and my objective function calls erf(). My code throws an error saying that the erf() function is not defined.
I have called Pkg.update(), loaded the SpecialFunctions package and switched among different solvers (NLopt, Ipopt), but none of them worked.
Below are the environment and a minimized example, tested under Julia 1.0.3:
julia> versioninfo()
Julia Version 1.0.3
Commit 099e826241 (2018-12-18 01:34 UTC)
Platform Info:
OS: Windows (x86_64-w64-mingw32)
CPU: Intel(R) Core(TM) i7 CPU 860 @ 2.80GHz
WORD_SIZE: 64
LIBM: libopenlibm
LLVM: libLLVM-6.0.0 (ORCJIT, nehalem)
Environment:
JULIA_EDITOR = "C:\Users\binha\AppData\Local\atom\app-1.34.0\atom.exe" -a
JULIA_NUM_THREADS = 4
julia> Pkg.installed()
Dict{String,Union{Nothing, VersionNumber}} with 25 entries:
"Statistics" => nothing
"GR" => v"0.37.0"
"Distributions" => v"0.16.4"
"Random" => nothing
"Atom" => v"0.7.14"
"UUIDs" => nothing
"FastGaussQuadrature" => v"0.3.2"
"JuMP" => v"0.18.5"
"Juno" => v"0.5.4"
"LinearAlgebra" => nothing
"Ipopt" => v"0.5.1"
"NLopt" => v"0.5.1"
"PyCall" => v"1.18.5"
"LaTeXStrings" => v"1.0.3"
"SymPy" => v"0.8.3"
"StatsBase" => v"0.27.0"
"Plots" => v"0.22.5"
"PyPlot" => v"2.7.0"
"ProgressMeter" => v"0.9.0"
"QuadGK" => v"2.0.3"
"NLsolve" => v"3.0.1"
"SpecialFunctions" => v"0.7.2"
"ECOS" => v"0.9.4"
"PoissonRandom" => v"0.4.0"
"PlotlyJS" => v"0.12.2"
julia> using JuMP
julia> using NLopt
julia> using SpecialFunctions
julia> erf(1)
0.8427007929497149
julia> m=Model(solver=NLoptSolver(algorithm=:LD_MMA))
Feasibility problem with:
* 0 linear constraints
* 0 variables
Solver is NLopt
julia> @variable(m,x,start=0.0)
x
julia> @NLobjective(m,Min,erf(x))
JuMP.NonlinearExprData(ReverseDiffSparse.NodeData[NodeData(CALLUNIVAR, 53, -1), NodeData(VARIABLE, 1, 1)], Float64[])
julia> solve(m)
ERROR: UndefVarError: erf not defined
Stacktrace:
[1] #forward_eval#7(::ReverseDiffSparse.UserOperatorRegistry, ::Function, ::Array{Float64,1}, ::Array{Float64,1}, ::Array{ReverseDiffSparse.NodeData,1}, ::SparseArrays.SparseMatrixCSC{Bool,Int64}, ::Array{Float64,1}, ::Array{Float64,1}, ::Array{Float64,1}, ::Array{Float64,1}, ::Array{Float64,1}, ::Array{Float64,1}) at C:\Users\binha\.julia\packages\ReverseDiffSparse\gvK2V\src\forward.jl:380
[2] #forward_eval at .\none:0 [inlined]
[3] forward_eval_all(::JuMP.NLPEvaluator, ::Array{Float64,1}) at C:\Users\binha\.julia\packages\JuMP\PbnIJ\src\nlp.jl:445
[4] eval_grad_f(::JuMP.NLPEvaluator, ::Array{Float64,1}, ::Array{Float64,1}) at C:\Users\binha\.julia\packages\JuMP\PbnIJ\src\nlp.jl:496
[5] initialize(::JuMP.NLPEvaluator, ::Array{Symbol,1}) at C:\Users\binha\.julia\packages\JuMP\PbnIJ\src\nlp.jl:403
[6] loadproblem!(::NLopt.NLoptMathProgModel, ::Int64, ::Int64, ::Array{Float64,1}, ::Array{Float64,1}, ::Array{Float64,1}, ::Array{Float64,1}, ::Symbol, ::JuMP.NLPEvaluator) at C:\Users\binha\.julia\packages\NLopt\eqN9a\src\NLoptSolverInterface.jl:117
[7] _buildInternalModel_nlp(::Model, ::JuMP.ProblemTraits) at C:\Users\binha\.julia\packages\JuMP\PbnIJ\src\nlp.jl:1244
[8] #build#123(::Bool, ::Bool, ::JuMP.ProblemTraits, ::Function, ::Model) at C:\Users\binha\.julia\packages\JuMP\PbnIJ\src\solvers.jl:304
[9] #build at .\none:0 [inlined]
[10] #solve#120(::Bool, ::Bool, ::Bool, ::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Function, ::Model) at C:\Users\binha\.julia\packages\JuMP\PbnIJ\src\solvers.jl:168
[11] solve(::Model) at C:\Users\binha\.julia\packages\JuMP\PbnIJ\src\solvers.jl:150
[12] top-level scope at none:0
Is there any way to call erf() here with the solver?

Registering a custom function should help here:
julia> using JuMP, NLopt, SpecialFunctions
julia> f(x) = erf(x)
f (generic function with 1 method)
julia> m=Model(solver=NLoptSolver(algorithm=:LD_MMA))
Feasibility problem with:
* 0 linear constraints
* 0 variables
Solver is NLopt
julia> JuMP.register(m, :f, 1, f, autodiff=true)
julia> @variable(m,x,start=0.0)
x
julia> @NLobjective(m,Min,f(x))
JuMP.NonlinearExprData(ReverseDiffSparse.NodeData[NodeData(CALLUNIVAR, 73, -1), NodeData(VARIABLE, 1, 1)], Float64[])
julia> solve(m)
:Optimal
julia> getvalue(x)
-4.107514124511155
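Since LD_MMA is a gradient-based algorithm, another option is to register the function together with hand-written derivatives instead of relying on autodiff. A minimal sketch, assuming the JuMP 0.18-style register method that accepts explicit first and second derivatives for univariate functions (the name :myerf and the helper names are arbitrary):

using JuMP, NLopt, SpecialFunctions

myerf(x) = erf(x)
# d/dx erf(x) = 2/sqrt(pi) * exp(-x^2)
myerf_d(x) = 2 / sqrt(pi) * exp(-x^2)
# second derivative, in case the solver asks for Hessians
myerf_dd(x) = -4x / sqrt(pi) * exp(-x^2)

m = Model(solver=NLoptSolver(algorithm=:LD_MMA))
JuMP.register(m, :myerf, 1, myerf, myerf_d, myerf_dd)
@variable(m, x, start=0.0)
@NLobjective(m, Min, myerf(x))
solve(m)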

Related

Julia Roots find_zero with ForwardDiff.Dual type?

I'm trying to apply automatic differentiation (ForwardDiff) to a function that contains an instance of find_zero (Roots) and am encountering an error that seems to relate to find_zero not accepting the ForwardDiff.Dual type.
Here's a (contrived) minimal working example that illustrates the issue:
using Distributions
using Roots
using StatsFuns
using ForwardDiff
function test_fun(θ::AbstractVector{T}) where T
μ,σ,p = θ;
z_star = find_zero(z -> logistic(z) - p, 0.0)
return pdf(Normal(μ,σ),z_star)
end
test_fun([0.0,1.0,0.75])
ForwardDiff.gradient(test_fun,[0.0,1.0,0.75])
This results in the following error:
ERROR: MethodError: no method matching Float64(::ForwardDiff.Dual{ForwardDiff.Tag{typeof(test_fun),Float64},Float64,3})
Closest candidates are:
Float64(::Real, ::RoundingMode) where T<:AbstractFloat at rounding.jl:200
Float64(::T) where T<:Number at boot.jl:716
Float64(::Irrational{:invsqrt2}) at irrationals.jl:189
...
Stacktrace:
[1] convert(::Type{Float64}, ::ForwardDiff.Dual{ForwardDiff.Tag{typeof(test_fun),Float64},Float64,3}) at ./number.jl:7
[2] setproperty!(::Roots.UnivariateZeroState{Float64,ForwardDiff.Dual{ForwardDiff.Tag{typeof(test_fun),Float64},Float64,3}}, ::Symbol, ::ForwardDiff.Dual{ForwardDiff.Tag{typeof(test_fun),Float64},Float64,3}) at ./Base.jl:34
[3] update_state(::Roots.Secant, ::Roots.DerivativeFree{Roots.DerivativeFree{var"#5#6"{ForwardDiff.Dual{ForwardDiff.Tag{typeof(test_fun),Float64},Float64,3}}}}, ::Roots.UnivariateZeroState{Float64,ForwardDiff.Dual{ForwardDiff.Tag{typeof(test_fun),Float64},Float64,3}}, ::Roots.UnivariateZeroOptions{Float64,Float64,ForwardDiff.Dual{ForwardDiff.Tag{typeof(test_fun),Float64},Float64,3},ForwardDiff.Dual{ForwardDiff.Tag{typeof(test_fun),Float64},Float64,3}}) at /bbkinghome/asharris/.julia/packages/Roots/TZpjF/src/derivative_free.jl:163
[4] find_zero(::Roots.Secant, ::Roots.AlefeldPotraShi, ::Roots.DerivativeFree{Roots.DerivativeFree{var"#5#6"{ForwardDiff.Dual{ForwardDiff.Tag{typeof(test_fun),Float64},Float64,3}}}}, ::Roots.UnivariateZeroState{Float64,ForwardDiff.Dual{ForwardDiff.Tag{typeof(test_fun),Float64},Float64,3}}, ::Roots.UnivariateZeroOptions{Float64,Float64,ForwardDiff.Dual{ForwardDiff.Tag{typeof(test_fun),Float64},Float64,3},ForwardDiff.Dual{ForwardDiff.Tag{typeof(test_fun),Float64},Float64,3}}, ::Roots.NullTracks) at /bbkinghome/asharris/.julia/packages/Roots/TZpjF/src/find_zero.jl:868
[5] find_zero(::Roots.DerivativeFree{var"#5#6"{ForwardDiff.Dual{ForwardDiff.Tag{typeof(test_fun),Float64},Float64,3}}}, ::Float64, ::Roots.Secant, ::Roots.AlefeldPotraShi; tracks::Roots.NullTracks, verbose::Bool, p::Nothing, kwargs::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}) at /bbkinghome/asharris/.julia/packages/Roots/TZpjF/src/find_zero.jl:689
[6] #find_zero#36 at /bbkinghome/asharris/.julia/packages/Roots/TZpjF/src/derivative_free.jl:123 [inlined]
[7] find_zero at /bbkinghome/asharris/.julia/packages/Roots/TZpjF/src/derivative_free.jl:120 [inlined]
[8] #find_zero#5 at /bbkinghome/asharris/.julia/packages/Roots/TZpjF/src/find_zero.jl:707 [inlined]
[9] find_zero at /bbkinghome/asharris/.julia/packages/Roots/TZpjF/src/find_zero.jl:707 [inlined]
[10] test_fun at ./REPL[7856]:3 [inlined]
[11] vector_mode_dual_eval at /bbkinghome/asharris/.julia/packages/ForwardDiff/QOqCN/src/apiutils.jl:37 [inlined]
[12] vector_mode_gradient(::typeof(test_fun), ::Array{Float64,1}, ::ForwardDiff.GradientConfig{ForwardDiff.Tag{typeof(test_fun),Float64},Float64,3,Array{ForwardDiff.Dual{ForwardDiff.Tag{typeof(test_fun),Float64},Float64,3},1}}) at /bbkinghome/asharris/.julia/packages/ForwardDiff/QOqCN/src/gradient.jl:106
[13] gradient(::Function, ::Array{Float64,1}, ::ForwardDiff.GradientConfig{ForwardDiff.Tag{typeof(test_fun),Float64},Float64,3,Array{ForwardDiff.Dual{ForwardDiff.Tag{typeof(test_fun),Float64},Float64,3},1}}, ::Val{true}) at /bbkinghome/asharris/.julia/packages/ForwardDiff/QOqCN/src/gradient.jl:19
[14] gradient(::Function, ::Array{Float64,1}, ::ForwardDiff.GradientConfig{ForwardDiff.Tag{typeof(test_fun),Float64},Float64,3,Array{ForwardDiff.Dual{ForwardDiff.Tag{typeof(test_fun),Float64},Float64,3},1}}) at /bbkinghome/asharris/.julia/packages/ForwardDiff/QOqCN/src/gradient.jl:17 (repeats 2 times)
[15] top-level scope at REPL[7858]:1
[16] run_repl(::REPL.AbstractREPL, ::Any) at /builddir/build/BUILD/julia/build/usr/share/julia/stdlib/v1.5/REPL/src/REPL.jl:288
I have limited experience using the ForwardDiff package and am probably misunderstanding how the Dual type works, so I would really appreciate it if someone knows how to solve this issue. Thanks so much!
z_star = find_zero(z -> logistic(z) - p, 0.0)
You have a fixed initial condition, 0.0, which is a plain Float64 rather than a Dual, so the root-finder's state cannot carry the dual parts (hence the failed convert to Float64). Make the starting value dual:
z_star = find_zero(z -> logistic(z) - p, zero(eltype(θ)))
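For completeness, here is the question's function with that fix applied, so the whole gradient call goes through (same code as above, only the starting value changed):

using Distributions, Roots, StatsFuns, ForwardDiff

function test_fun(θ::AbstractVector{T}) where T
    μ, σ, p = θ
    # seed the root search with a value of the same (possibly Dual) element type as θ
    z_star = find_zero(z -> logistic(z) - p, zero(eltype(θ)))
    return pdf(Normal(μ, σ), z_star)
end

test_fun([0.0, 1.0, 0.75])
ForwardDiff.gradient(test_fun, [0.0, 1.0, 0.75])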

Interact.jl error: WebIOServer not defined

I am getting this error when using the Interact.jl @manipulate macro in Juno:
UndefVarError: WebIOServer not defined
setup_server() at webio.jl:76
show(::IOContext{Base.GenericIOBuffer{Array{UInt8,1}}}, ::MIME{Symbol("application/prs.juno.plotpane+html")}, ::Widget{:manipulate,Any}) at webio.jl:68
show(::IOContext{Base.GenericIOBuffer{Array{UInt8,1}}}, ::String, ::Widget{:manipulate,Any}) at multimedia.jl:109
displayinplotpane(::Widget{:manipulate,Any}) at showdisplay.jl:51
displayandrender(::Widget{:manipulate,Any}) at showdisplay.jl:131
(::Atom.var"#208#213"{String})() at eval.jl:136
#invokelatest#1 at essentials.jl:712 [inlined]
invokelatest at essentials.jl:711 [inlined]
macro expansion at dynamic.jl:24 [inlined]
eval(::String, ::Int64, ::String, ::String, ::Bool) at eval.jl:113
invokelatest(::Any, ::Any, ::Vararg{Any,N} where N; kwargs::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}) at essentials.jl:712
invokelatest(::Any, ::Any, ::Vararg{Any,N} where N) at essentials.jl:711
macro expansion at eval.jl:41 [inlined]
(::Atom.var"#188#189")() at task.jl:358
Piece of code (the function generatePlot is defined previously for generating Makie plots):
@manipulate for variable = Dict(string(vars[i])=>vars[i] for i=1:16),
labl = Dict(string(label[i])=>label[i] for i=1:length(label))
generatePlot(variable, labl)
end
Env:
[c601a237] Interact v0.10.3
[ee78f7c6] Makie v0.11.0
[85f8d34a] NCDatasets v0.10.2
[2913bbd2] StatsBase v0.33.0
[0f1e0344] WebIO v0.8.14
Any thoughts on what the problem might be?

How to overload Base.show for custom array types?

Suppose I make my own custom vector type with its own custom show method:
struct MyVector{T} <: AbstractVector{T}
v::Vector{T}
end
function Base.show(io::IO, v::MyVector{T}) where {T}
println(io, "My custom vector with eltype $T with elements")
for i in eachindex(v)
println(io, " ", v.v[i])
end
end
If I try making one of these objects at the REPL I get unexpected errors related to functions I never intended to call:
julia> MyVector([1, 2, 3])
Error showing value of type MyVector{Int64}:
ERROR: MethodError: no method matching size(::MyVector{Int64})
Closest candidates are:
size(::AbstractArray{T,N}, ::Any) where {T, N} at abstractarray.jl:38
size(::BitArray{1}) at bitarray.jl:77
size(::BitArray{1}, ::Integer) at bitarray.jl:81
...
Stacktrace:
[1] axes at ./abstractarray.jl:75 [inlined]
[2] summary(::IOContext{REPL.Terminals.TTYTerminal}, ::MyVector{Int64}) at ./show.jl:1877
[3] show(::IOContext{REPL.Terminals.TTYTerminal}, ::MIME{Symbol("text/plain")}, ::MyVector{Int64}) at ./arrayshow.jl:316
[4] display(::REPL.REPLDisplay, ::MIME{Symbol("text/plain")}, ::Any) at /Users/mason/julia/usr/share/julia/stdlib/v1.3/REPL/src/REPL.jl:132
[5] display(::REPL.REPLDisplay, ::Any) at /Users/mason/julia/usr/share/julia/stdlib/v1.3/REPL/src/REPL.jl:136
[6] display(::Any) at ./multimedia.jl:323
...
Okay, whatever so I'll implement Base.size so it'll leave me alone:
julia> Base.size(v::MyVector) = size(v.v)
julia> MyVector([1, 2, 3])
3-element MyVector{Int64}:
Error showing value of type MyVector{Int64}:
ERROR: getindex not defined for MyVector{Int64}
Stacktrace:
[1] error(::String, ::Type) at ./error.jl:42
[2] error_if_canonical_getindex(::IndexCartesian, ::MyVector{Int64}, ::Int64) at ./abstractarray.jl:991
[3] _getindex at ./abstractarray.jl:980 [inlined]
[4] getindex at ./abstractarray.jl:981 [inlined]
[5] isassigned(::MyVector{Int64}, ::Int64, ::Int64) at ./abstractarray.jl:405
[6] alignment(::IOContext{REPL.Terminals.TTYTerminal}, ::MyVector{Int64}, ::UnitRange{Int64}, ::UnitRange{Int64}, ::Int64, ::Int64, ::Int64) at ./arrayshow.jl:67
[7] print_matrix(::IOContext{REPL.Terminals.TTYTerminal}, ::MyVector{Int64}, ::String, ::String, ::String, ::String, ::String, ::String, ::Int64, ::Int64) at ./arrayshow.jl:186
[8] print_matrix at ./arrayshow.jl:159 [inlined]
[9] print_array at ./arrayshow.jl:308 [inlined]
[10] show(::IOContext{REPL.Terminals.TTYTerminal}, ::MIME{Symbol("text/plain")}, ::MyVector{Int64}) at ./arrayshow.jl:345
[11] display(::REPL.REPLDisplay, ::MIME{Symbol("text/plain")}, ::Any) at /Users/mason/julia/usr/share/julia/stdlib/v1.3/REPL/src/REPL.jl:132
[12] display(::REPL.REPLDisplay, ::Any) at /Users/mason/julia/usr/share/julia/stdlib/v1.3/REPL/src/REPL.jl:136
[13] display(::Any) at ./multimedia.jl:323
...
Hmm, now it wants getindex
julia> Base.getindex(v::MyVector, args...) = getindex(v.v, args...)
julia> MyVector([1, 2, 3])
3-element MyVector{Int64}:
1
2
3
What? That wasn't the print formatting I told it to do! What's going on here?
The problem is that Base defines a method Base.show(io::IO, ::MIME"text/plain", X::AbstractArray) which, for the purposes of REPL display, is more specific than your Base.show(io::IO, v::MyVector). This section of the Julia manual describes the sort of custom printing that AbstractArray uses. So if we want our custom show method to be used, we instead need to do
julia> function Base.show(io::IO, ::MIME"text/plain", v::MyVector{T}) where {T}
println(io, "My custom vector with eltype $T and elements")
for i in eachindex(v)
println(io, " ", v.v[i])
end
end
julia> MyVector([1, 2, 3])
My custom vector with eltype Int64 and elements
1
2
3
See also: https://discourse.julialang.org/t/extending-base-show-for-array-of-types/31289
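As a side note, the two-argument show(io, v) is still what other printing paths use (print, string interpolation, and elements shown inside another container), so it can be worth defining both methods next to the minimal AbstractArray interface. A small self-contained sketch:

struct MyVector{T} <: AbstractVector{T}
    v::Vector{T}
end

# minimal AbstractArray interface
Base.size(v::MyVector) = size(v.v)
Base.getindex(v::MyVector, i::Int) = v.v[i]

# compact form, used by print and when the value appears inside other containers
Base.show(io::IO, v::MyVector) = print(io, "MyVector(", v.v, ")")

# rich form, used by the REPL's display
function Base.show(io::IO, ::MIME"text/plain", v::MyVector{T}) where {T}
    println(io, "My custom vector with eltype $T and elements")
    for x in v
        println(io, "  ", x)
    end
end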

Saving an OrderedDict to Julia Data Format

I wish to use the JLD package to write an OrderedDict to file in such a way that I can subsequently read it back unchanged.
Here was my first effort:
using JLD, HDF5, DataStructures
function testjld()
res = OrderedDict("A" => 1, "B" => 2)
filename = "c:/temp/test.jld"
save(File(format"JLD", filename), "res", res)
res2 = load(filename)["res"]
#Check if round-tripping works
res == res2
end
But the "round-tripping" doesn't work - the function returns false. It also raises a warning:
julia> testjld()
┌ Warning: type JLD.AssociativeWrapper{Core.String,Core.Int64,OrderedCollections.OrderedDict{Core.String,Core.Int64}} not present in workspace; reconstructing
└ @ JLD C:\Users\Philip\.julia\packages\JLD\1BoSz\src\jld_types.jl:703
false
After reading the docs, I thought that JLD does not support OrderedDict "out of the box", but it does support Dict, so I can use that fact to write my own custom serialisation for OrderedDict. Something like this:
struct OrderedDictSerializer
d::Dict
end
JLD.writeas(data::OrderedDict) = OrderedDictSerializer(Dict("contents" => convert(Dict, data),
"keyorder" => [k for (k, v) in data]))
function JLD.readas(serdata::OrderedDictSerializer)
unordered = serdata.d["contents"]
keyorder = serdata.d["keyorder"]
OrderedDict((k, unordered[k]) for k in keyorder)
end
Hardly an exhaustive test, but this does seem to work:
julia> testjld()
true
Am I correct in thinking I need to write my own serializer for OrderedDict, and can my serializer be improved?
EDIT
The answer to my question "Can my serializer be improved?" seems to be "It will have to be, though I don't yet understand how."
Consider the two following test functions:
function testjld2()
res = OrderedDict("A" => [1.0,2.0],"B" => [3.0,4.0])
#check if round-tripping of readas and writeas methods works:
JLD.readas(JLD.writeas(res)) == res
end
function testjld3()
res = OrderedDict("A" => [1.0,2.0],"B" => [3.0,4.0])
filename = "c:/temp/test.jld"
save(File(format"JLD", filename), "res", res)
res2 = load(filename)["res"]
#Check if round-tripping to jld file and back works
res == res2
end
testjld2 shows that my writeas and readas methods correctly round-trip an OrderedDict{String,Array{Float64,1}} with two entries:
julia> testjld2()
true
and yet testjld3 doesn't work at all; it yields an error:
julia> testjld3()
HDF5-DIAG: Error detected in HDF5 (1.10.5) thread 0:
#000: E:/mingwbuild/mingw-w64-hdf5/src/hdf5-1.10.5/src/H5Tfields.c line 60 in H5Tget_nmembers(): not a datatype
major: Invalid arguments to routine
minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.10.5) thread 0:
#000: E:/mingwbuild/mingw-w64-hdf5/src/hdf5-1.10.5/src/H5Tfields.c line 60 in H5Tget_nmembers(): not a datatype
major: Invalid arguments to routine
minor: Inappropriate type
ERROR: Error getting the number of members
Stacktrace:
[1] error(::String) at .\error.jl:33
[2] h5t_get_nmembers at C:\Users\Philip\.julia\packages\HDF5\rF1Fe\src\HDF5.jl:2279 [inlined]
[3] _gen_h5convert!(::Any) at C:\Users\Philip\.julia\packages\JLD\1BoSz\src\jld_types.jl:638
[4] #s27#9(::Any, ::Any, ::Any, ::Any, ::Any, ::Any) at C:\Users\Philip\.julia\packages\JLD\1BoSz\src\jld_types.jl:664
[5] (::Core.GeneratedFunctionStub)(::Any, ::Vararg{Any,N} where N) at .\boot.jl:524
[6] #write_compound#24(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::typeof(JLD.write_compound), ::JLD.JldGroup, ::String, ::JLD.AssociativeWrapper{String,Any,Dict{String,Any}}, ::JLD.JldWriteSession) at C:\Users\Philip\.julia\packages\JLD\1BoSz\src\JLD.jl:700
[7] write_compound at C:\Users\Philip\.julia\packages\JLD\1BoSz\src\JLD.jl:694 [inlined]
[8] #_write#23 at C:\Users\Philip\.julia\packages\JLD\1BoSz\src\JLD.jl:690 [inlined]
[9] _write at C:\Users\Philip\.julia\packages\JLD\1BoSz\src\JLD.jl:690 [inlined]
[10] write_ref(::JLD.JldFile, ::Dict{String,Any}, ::JLD.JldWriteSession) at C:\Users\Philip\.julia\packages\JLD\1BoSz\src\JLD.jl:658
[11] macro expansion at C:\Users\Philip\.julia\packages\JLD\1BoSz\src\jld_types.jl:648 [inlined]
[12] h5convert!(::Ptr{UInt8}, ::JLD.JldFile, ::OrderedDictSerializer, ::JLD.JldWriteSession) at C:\Users\Philip\.julia\packages\JLD\1BoSz\src\jld_types.jl:664
[13] #write_compound#24(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::typeof(JLD.write_compound), ::JLD.JldFile, ::String, ::OrderedDictSerializer, ::JLD.JldWriteSession) at C:\Users\Philip\.julia\packages\JLD\1BoSz\src\JLD.jl:700
[14] write_compound at C:\Users\Philip\.julia\packages\JLD\1BoSz\src\JLD.jl:694 [inlined]
[15] #_write#23 at C:\Users\Philip\.julia\packages\JLD\1BoSz\src\JLD.jl:690 [inlined]
[16] _write at C:\Users\Philip\.julia\packages\JLD\1BoSz\src\JLD.jl:690 [inlined]
[17] #write#17(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::typeof(write), ::JLD.JldFile, ::String, ::OrderedDict{String,Array{Float64,1}}, ::JLD.JldWriteSession) at C:\Users\Philip\.julia\packages\JLD\1BoSz\src\JLD.jl:514
[18] write at C:\Users\Philip\.julia\packages\JLD\1BoSz\src\JLD.jl:514 [inlined]
[19] #35 at C:\Users\Philip\.julia\packages\JLD\1BoSz\src\JLD.jl:1223 [inlined]
[20] #jldopen#14(::Base.Iterators.Pairs{Symbol,Bool,Tuple{Symbol,Symbol},NamedTuple{(:compatible, :compress),Tuple{Bool,Bool}}}, ::typeof(jldopen), ::getfield(JLD, Symbol("##35#36")){String,OrderedDict{String,Array{Float64,1}},Tuple{}},
::String, ::Vararg{String,N} where N) at C:\Users\Philip\.julia\packages\JLD\1BoSz\src\JLD.jl:246
[21] testjld3() at .\none:0
[22] top-level scope at REPL[48]:1
Use JLD2 instead:
using JLD2, DataStructures, FileIO
function testjld2()
res = OrderedDict("A" => 1, "B" => 2)
myfilename = "c:/temp/test.jld2"
save(myfilename, "res", res)
res2 = load(myfilename)["res"]
#Check if round-tripping works
res == res2
end
Testing:
julia> testjld2()
true
Personally, whenever I can I use BSON:
using DataStructures, BSON, OrderedCollections
function testbson()
res = OrderedDict("A" => 1, "B" => 2)
myfilename = "c:/temp/test.bjson"
BSON.bson(myfilename, Dict("res" => res))
res2 = BSON.load(myfilename)["res"]
#Check if round-tripping works
res == res2
end
julia> testbson()
true

LoadError: PyCall.PyError ("$(Expr(:escape, :(ccall(#= /home/omkar/.julia/packages/PyCall/ttONZ/src/pyfncall.jl:44 =# @pysym(:PyObject_Call)

I am getting the following error from PyCall on Julia v1.2.0:
ERROR: LoadError: PyCall.PyError("$(Expr(:escape, :(ccall(#= /home/omkar/.julia/packages/PyCall/ttONZ/src/pyfncall.jl:44 =# @pysym(:PyObject_Call), PyPtr, (PyPtr, PyPtr, PyPtr), o, pyargsptr, kw))))", PyCall.PyObject(Ptr{PyCall.PyObject_struct} @0x00007f9ea56a2520), PyCall.PyObject(Ptr{PyCall.PyObject_struct} @0x00007f9db65dcca8), PyCall.PyObject(Ptr{PyCall.PyObject_struct} @0x00007f9d004f9f08))
Stacktrace:
[1] pyerr_check at /home/omkar/.julia/packages/PyCall/ttONZ/src/exception.jl:60 [inlined]
[2] pyerr_check at /home/omkar/.julia/packages/PyCall/ttONZ/src/exception.jl:64 [inlined]
[3] macro expansion at /home/omkar/.julia/packages/PyCall/ttONZ/src/exception.jl:84 [inlined]
[4] __pycall!(::PyCall.PyObject, ::Ptr{PyCall.PyObject_struct}, ::PyCall.PyObject, ::Ptr{Nothing}) at /home/omkar/.julia/packages/PyCall/ttONZ/src/pyfncall.jl:44
[5] _pycall!(::PyCall.PyObject, ::PyCall.PyObject, ::Tuple{String,String,String,String,String,Float64,String,Float64}, ::Int64, ::Ptr{Nothing}) at /home/omkar/.julia/packages/PyCall/ttONZ/src/pyfncall.jl:29
[6] _pycall!(::PyCall.PyObject, ::PyCall.PyObject, ::Tuple{String,String,String,String,String,Float64,String,Float64}, ::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}) at /home/omkar/.julia/packages/PyCall/ttONZ/src/pyfncall.jl:11
[7] #call#111(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::PyCall.PyObject, ::String, ::Vararg{Any,N} where N) at /home/omkar/.julia/packages/PyCall/ttONZ/src/pyfncall.jl:89
[8] (::PyCall.PyObject)(::String, ::Vararg{Any,N} where N) at /home/omkar/.julia/packages/PyCall/ttONZ/src/pyfncall.jl:89
I have already tried updating my packages; below is the setup I use for the PyCall build:
ENV["PYTHON"] = " "
Pkg.build("PyCall")
using PyCall, Conda
pyexec = pyimport("sys").executable
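For reference, a minimal sketch of the usual PyCall (re)build sequence; the path below is only an example, an empty string tells PyCall to install and use its own Conda-managed Python, and Julia must be restarted after the build for the change to take effect:

using Pkg

# either point PyCall at a specific interpreter ...
# ENV["PYTHON"] = "/usr/bin/python3"
# ... or let PyCall manage Python via Conda:
ENV["PYTHON"] = ""
Pkg.build("PyCall")

# after restarting Julia:
using PyCall
println(pyimport("sys").executable)  # check which Python is actually in use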
