In Symbolics.jl, I can formulate a set of equations purely symbolically.
I can, for example, define this differential equation using @syms:
using Symbolics
@syms α ρ[1:2, 1:2] dαdt dρdt[1:2, 1:2]
eqs = []
push!(eqs, dαdt == α*(ρ[1,1] + ρ[1,2] + ρ[2,1] + ρ[2,2]))
for i in 1:2, j in 1:2
    push!(eqs, dρdt[i,j] == α*ρ[j,i])
end
eqs then has the form:
5-element Vector{Any}:
dαdt == (α*(ρ[1, 1] + ρ[1, 2] + ρ[2, 1] + ρ[2, 2]))
dρdt[1, 1] == (α*ρ[1, 1])
dρdt[1, 2] == (α*ρ[2, 1])
dρdt[2, 1] == (α*ρ[1, 2])
dρdt[2, 2] == (α*ρ[2, 2])
To solve the above equations using e.g. ModelingToolkit.jl, the symbolic parameters must be replaced by variables, e.g.
@variables t::Real, αvar(t)::Complex{Real}, ρvar(t)[1:2, 1:2]::Complex{Real}
Question: How can I transform the above symbolic equation to the correct variables, so that it can be solved using e.g. an ODEProblem?
Remark: A trivial solution is of course to just use the variables as defined above in the first place. This is however not the point of this question.
Something like this may work:
using Symbolics
@syms τ α ρ[1:2, 1:2]
D = Differential(τ)
eqs = []
push!(eqs, D(α) ~ α*(ρ[1,1] + ρ[1,2] + ρ[2,1] + ρ[2,2]))
for i in 1:2, j in 1:2
    push!(eqs, D(ρ[i,j]) ~ α*ρ[j,i])
end
and then
using ModelingToolkit
@parameters t
@variables αvar(t), ρvar(t)[1:2, 1:2]
subs = Dict(τ => t, α => αvar, ρ => ρvar)
eqs_subbed = substitute.(eqs, Ref(subs))
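As a hedged sketch of the step that would follow (not verified end-to-end here), the usual ModelingToolkit workflow once equations are written in terms of @variables looks roughly like this; the variable u, the toy equation, and the initial condition below are purely illustrative, not taken from the question:
using ModelingToolkit, OrdinaryDiffEq
@variables t u(t)
D = Differential(t)
eqs_mtk = [D(u) ~ 2u]                              # stand-in for the substituted equations
@named sys = ODESystem(eqs_mtk, t)
prob = ODEProblem(structural_simplify(sys), [u => 1.0], (0.0, 1.0))
sol = solve(prob, Tsit5())                         # any ODE solver from OrdinaryDiffEq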
How to get coefficients for ALL combinations of the variables of a multivariable polynomial using sympy.jl or another Julia package for symbolic computation?
Here is an example from MATLAB,
syms a b y
[cxy, txy] = coeffs(a*x^2 + b*y, [y x], 'All')
cxy =
[ 0, 0, b]
[ a, 0, 0]
txy =
[ x^2*y, x*y, y]
[ x^2, x, 1]
My goal is to get
[ x^2*y, x*y, y]
[ x^2, x, 1]
instead of [x^2, y]
I asked the same question at
https://github.com/JuliaPy/SymPy.jl/issues/482
and
https://discourse.julialang.org/t/symply-jl-for-getting-coefficients-for-all-combination-of-the-variables-of-a-multivariable-polynomial/89091
but I think I should also ask whether this can be done using the Python SymPy library directly.
Using Julia, I tried the following,
julia> @syms x, y, a, b
julia> ff = sympy.Poly(a*x^2 + b*y, (x,y))
Poly(a*x**2 + b*y, x, y, domain='ZZ[a,b]')
julia> [prod(ff.gens.^i) for i in ff.monoms()]
2-element Vector{Sym}:
x^2
y
This is a longer-form rewrite of the one-liner in the comment.
It uses Pipe.jl to write expressions 'functionally', so familiarity with the pipe operator (|>) and Pipe.jl will help.
using SymPy
using Pipe
@syms x, y, a, b
ff = sympy.Poly(a*x^2 + b*y, (x,y))
max_degrees =
    @pipe ff.monoms() .|> collect |> hcat(_...) |>
          reduce(max, _, dims=2) |> vec
degree_iter =
    @pipe max_degrees .|> UnitRange(0, _) |>
          tuple(_...) |> CartesianIndices
result = [prod(ff.gens.^Tuple(I)) for I in degree_iter] |>
    reverse |> eachcol |> collect
or using more of the Python methods:
[prod(ff.gens.^I) for
    I in Iterators.product((0:d for d in ff.degree.(ff.gens))...)] |>
    reverse |> eachcol |> collect
Both give the desired result:
2-element Vector{...}:
[x^2*y, x*y, y]
[x^2, x, 1]
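A hedged follow-up sketch, not part of the original answer: to also recover the coefficients corresponding to each monomial (the analogue of the MATLAB cxy, up to orientation), one could query each monomial with SymPy's Poly.coeff_monomial, assuming the usual SymPy.jl method passthrough works here; the names terms and coefs are just for illustration:
terms = [prod(ff.gens.^Tuple(I)) for I in degree_iter] |> reverse
coefs = [ff.coeff_monomial(t) for t in terms]   # a zero entry wherever a monomial is absent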
UPDATE:
In case there are more than 2 generators, the result needs to be an array of higher dimension. The final bits of matrix transposing are immaterial, and the expressions become:
Method 1:
max_degrees =
    @pipe ff.monoms() .|> collect |> hcat(_...) |>
          reduce(max, _, dims=2) |> vec
degree_iter =
    @pipe max_degrees .|> UnitRange(0, _) |>
          tuple(_...) |> CartesianIndices
result = [prod(ff.gens.^Tuple(I)) for I in degree_iter]
Method 2:
result = [prod(ff.gens.^I) for
    I in Iterators.product((0:d for d in ff.degree.(ff.gens))...)]
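For illustration, a hedged usage sketch of Method 1 with three generators; the polynomial here is made up for demonstration and the output is not reproduced:
using SymPy, Pipe
@syms x y z a b
ff = sympy.Poly(a*x^2 + b*y*z + x*y*z^2, (x, y, z))
max_degrees =
    @pipe ff.monoms() .|> collect |> hcat(_...) |>
          reduce(max, _, dims=2) |> vec
degree_iter =
    @pipe max_degrees .|> UnitRange(0, _) |>
          tuple(_...) |> CartesianIndices
result = [prod(ff.gens.^Tuple(I)) for I in degree_iter]   # a 3-dimensional array of monomials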
Thanks a lot @Dan Getz. Your solution works for the TOY example from MATLAB. My real case is more complicated, with more variables and polynomials. I tried your method for 3 variables,
using SymPy
@syms x, y, z, a, b
ff = sympy.Poly(a*x^2 + b*y + z^2 + x*y + y*z, (x, y, z))
[prod(ff.gens.^Tuple(I)) for I in CartesianIndices(tuple(UnitRange.(0,vec(reduce(max, hcat(collect.(ff.monoms())...), dims=1)))...))]
I got the following error,
ERROR: LoadError: DimensionMismatch: arrays could not be broadcast to a common size; got a dimension with lengths 3 and 5
Stacktrace:
How can I generalize your method to any number of variables with different degrees, e.g., x^3 + y^3 + z^3 + x*y*z + x*y^2*z?
You can find the degree of each of the two variables of interest and then use them to create the matrix of generators, which you can then use to get the coefficients of interest. I am not sure what you would expect if the equation were like a*x**2 + b*y + c...
>>> from sympy import *
>>> from sympy.abc import a, b, x, y
>>> eq = a*x**2 + b*y
>>> deg = lambda x: Poly(eq, x).degree() # helper to give degree in "x"
>>> v = (Matrix([x**i for i in range(deg(x),-1,-1)]
... )*Matrix([y**i for i in range(deg(y),-1,-1)]).T).T; v
Matrix([[x**2*y, x*y, y], [x**2, x, 1]])
>>> Matrix(*v.shape, [eq.coeff(i) if i.free_symbols else eq.as_coeff_Add()[0]
... for i in v])
Matrix([[0, 0, b], [a, 0, 0]])
From @jverzani (thanks):
using SymPy;
@syms a b x y;
eq = a*x^2 + b*y;
deg = x -> sympy.Poly(eq, x).degree();   # helper to give the degree in a variable
xs, ys = [x^i for i ∈ deg(x):-1:0], [y^i for i ∈ deg(y):-1:0];
v = permutedims(xs .* permutedims(ys));  # == [x^2*y x*y y; x^2 x 1]
[length(free_symbols(i)) > 0 ? eq.coeff(i) : eq.as_coeff_add()[1] for i ∈ v]
# == [0 0 b; a 0 0]
I am trying to implement this function in Julia and it is not working. I think the problem is with broadcasting; it doesn't seem to work with arrays.
When I write the relational operators with a dot (like .> instead of >), the number of errors decreases, but I get "TypeError: non-boolean (BitVector) used in boolean context".
How can I fix this?
function Rulkov(N, X, Y, α)
    global σ, μ
    for n = 1:1:N
        if (X[n, 1] <= 0)
            X[n, 2] = α[n] / (1 - X[n, 1]) + Y[n, 1]
        elseif (X[n, 1] > 0 && X[n, 1] < (α .+ Y[n, 1]))
            X[n, 2] = α[n] + Y[n, 1]
        else
            X[n, 2] = -1
        end
        Y[n, 2] = Y[n, 1] - μ*(X[n, 1] .+ 1) .+ μ*σ
    end
    return sum(X[:, 2])/N
end
Assuming that X and Y are matrices, X[n, 1] is a scalar and α .+ Y[n, 1] is a Vector, so there is no single meaningful comparison between these objects. So, depending on what you want, you can either use
all(X[n, 1] .< (α .+ Y[n, 1]))
or (probably more correct from the mathematical point of view)
X[n, 1] < minimum(α .+ Y[n, 1])
or a non-allocating version of the previous calculation
X[n, 1] < minimum(x -> x + Y[n, 1], α)
or (as it was proposed in comments)
X[n, 1] < minimum(α) + Y[n, 1]
if/else only accepts a Boolean, so I guess you need to call all or any on the BitVector first?
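A hedged sketch of how the loop might look with the last scalar comparison applied; the name Rulkov_fixed is just illustrative, σ and μ are passed as arguments instead of globals, and which comparison is the right one depends on the intended model:
function Rulkov_fixed(N, X, Y, α, σ, μ)
    for n in 1:N
        if X[n, 1] <= 0
            X[n, 2] = α[n] / (1 - X[n, 1]) + Y[n, 1]
        elseif X[n, 1] < minimum(α) + Y[n, 1]   # scalar comparison, as suggested above
            X[n, 2] = α[n] + Y[n, 1]
        else
            X[n, 2] = -1
        end
        Y[n, 2] = Y[n, 1] - μ*(X[n, 1] + 1) + μ*σ
    end
    return sum(X[:, 2]) / N
end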
I am using Julia 0.6.2 and JuMP 0.18.5 (I can't use a more recent version since I need to use an old package).
Creating JuMP variables with conditions on the index leads to a JuMPDict instead of an Array.
For example:
m = Model(solver = CplexSolver())
# type of x: JuMP.JuMPDict{JuMP.Variable,2}
@variable(m, x[i in 1:3, j in 1:3; i < j] >= 0)
# type of y: JuMP.JuMPDict{JuMP.Variable,3}
@variable(m, y[i in 1:3, j in 1:3, k in 1:3; i < j] >= 0)
I would like to apply a function f to x and to y[:, :, k] for all k in 1:3. However, I don't know how to define such a generic function.
I tried to set the argument type of f to JuMP.JuMPDict{JuMP.Variable,2}:
function f(input::JuMP.JuMPDict{JuMP.Variable,2})
...
end
I can use the function on x but not on y:
f(x) # Works
for k in 1:3
    f(y[:, :, k]) # does not work as y is not an array
end
My last idea was to convert y into several JuMP.JuMPDict{JuMP.Variable,2} objects:
function convertTo2D(dict3D::JuMP.JuMPDict{JuMP.Variable,3}, k::Int)
    dict2D = JuMP.JuMPDict{JuMP.Variable,2}() # This line returns "ERROR: KeyError: key :model not found"
    for (key, value) in keys(dict3D)
        if key[3] == k
            dict2D[(key[1], key[2])] = value # Not sure if it will work
        end
    end
    return dict2D
end
If this was working I could use:
for k in 1:3
f(convertTo2D(y, k))
end
Do you know how I could fix convertTo2D or do what I want another way?
Anonymous variables solved my problem. Thanks to them I can successively create the variables of y in a for loop. Variable y is now an array of "2D dictionaries" rather than a "3D dictionary":
y = Array{JuMP.JuMPDict{JuMP.Variable,2}, 1}([])
for k in 1:3
    yk = @variable(m, [i in 1:3, j in 1:3; i < j] >= 0)
    f(yk)
    push!(y, yk)
end
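A hedged sanity check (this is an assumption about JuMP 0.18 anonymous conditional containers, not something verified here): each anonymous yk should have the same container type that f dispatches on, which is why f(yk) works:
yk = @variable(m, [i in 1:3, j in 1:3; i < j] >= 0)
@assert yk isa JuMP.JuMPDict{JuMP.Variable,2}   # same type as x, so f(::JuMP.JuMPDict{JuMP.Variable,2}) applies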
I've tried to reproduce the model from a PyMC3 and Stan comparison. But it seems to run slowly, and when I look at @code_warntype there are some things -- K and N, I think -- which the compiler seemingly infers as Any.
I've tried adding types -- though I can't add types to turing_model's arguments, and things are complicated within turing_model because it's using autodiff variables and not the usual types. I put all the code into the function do_it to avoid globals, since globals can slow things down. (It actually seems slower, though.)
Any suggestions as to what's causing the problem? The turing_model code is what's iterating, so that should make the most difference.
using Turing, StatsPlots, Random
sigmoid(x) = 1.0 / (1.0 + exp(-x))
function scale(w0::Float64, w1::Array{Float64,1})
    scale = √(w0^2 + sum(w1 .^ 2))
    return w0 / scale, w1 ./ scale
end
function do_it(iterations::Int64)::Chains
    K = 10                       # predictor dimension
    N = 1000                     # number of data samples
    X = rand(N, K)               # predictors (1000, 10)
    w1 = rand(K)                 # weights (10,)
    w0 = -median(X * w1)         # 50% of elements for each class (number)
    w0, w1 = scale(w0, w1)       # unit length (euclidean)
    w_true = [w0, w1...]
    y = (w0 .+ (X * w1)) .> 0.0  # labels
    y = [Float64(x) for x in y]
    σ = 5.0
    σm = [x == y ? σ : 0.0 for x in 1:K, y in 1:K]
    @model turing_model(X, y, σ, σm) = begin
        w0_pred ~ Normal(0.0, σ)
        w1_pred ~ MvNormal(σm)
        p = sigmoid.(w0_pred .+ (X * w1_pred))
        @inbounds for n in 1:length(y)
            y[n] ~ Bernoulli(p[n])
        end
    end
    @time chain = sample(turing_model(X, y, σ, σm), NUTS(iterations, 200, 0.65));
    # ϵ = 0.5
    # τ = 10
    # @time chain = sample(turing_model(X, y, σ), HMC(iterations, ϵ, τ));
    return (w_true=w_true, chains=chain::Chains)
end
chain = do_it(1000)
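For reference, a minimal sketch of the inference check mentioned above (@code_warntype analyzes the call without actually running the sampler):
@code_warntype do_it(10)   # look for variables reported as Any in the printed type annotations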
I saw an outdated answer in the following thread (How to do "for all" in sum notation in Julia/JuMP)
which is unfortunately 3 years old, but it's exactly what I want. However, the code fails with a number of syntax errors, as the sum() syntax has changed over the past few years.
For my code, I found that sum() only works with one indexing variable i, but if I include another variable j, it stops working. I'm also using Jupyter Notebook, if that makes any difference. Any ideas?
using JuMP
ZS = Model(with_optimizer(Gurobi.Optimizer))
P = [[10 -20];
[30 -40]]
@variable(ZS, x[1,1:2])
@variable(ZS, y[1:2,1])
@objective(ZS, Max, sum(x[i]*P[i,j]*y[j] for i=1:2 for j=1:2))
@constraint(ZS, con1, x[1] + x[2] <= 1)
@constraint(ZS, con2, y[1] + y[2] <= 1)
optimize!(ZS)
For this example of code, I received a "key not found" error
It seems you need to update the for-loop syntax and set your solver to allow non-convex problems.
I also recommend using anonymous naming for variables, expressions, etc., so that you can change them as required.
using JuMP
using Gurobi
ZS = Model(Gurobi.Optimizer)
set_optimizer_attribute(ZS, "NonConvex", 2)
P = [[10 -20];
[30 -40]]
xs = @variable(ZS, x[1:2])
ys = @variable(ZS, y[1:2])
my_obj = @objective(ZS, Max, sum(x[i]*P[i,j]*y[j] for i in 1:2, j in 1:2))
con1 = @constraint(ZS, x[1] + x[2] <= 1)
con2 = @constraint(ZS, y[1] + y[2] <= 1)
optimize!(ZS)
Runtime is pretty dang long though...
Change definitions of variables to be one-dimensional like this:
@variable(ZS, x[1:2])
@variable(ZS, y[1:2])
and all should work as expected.
Alternatively leave x and y two dimensional and redefine your objective and constraints like this:
@objective(ZS, Max, sum(x[1,i]*P[i,j]*y[j,1] for i=1:2 for j=1:2))
@constraint(ZS, con1, x[1,1] + x[1,2] <= 1)
@constraint(ZS, con2, y[1,1] + y[2,1] <= 1)
As a side note you can define P more simply like this:
julia> P = [10 -20
            30 -40]
2×2 Array{Int64,2}:
 10  -20
 30  -40
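Putting the two answers together, a hedged end-to-end sketch of the model as posted, with one-dimensional variables and Gurobi's NonConvex flag for the bilinear objective (not run here):
using JuMP, Gurobi
ZS = Model(Gurobi.Optimizer)
set_optimizer_attribute(ZS, "NonConvex", 2)   # the bilinear objective needs this with Gurobi
P = [10 -20
     30 -40]
@variable(ZS, x[1:2])
@variable(ZS, y[1:2])
@objective(ZS, Max, sum(x[i] * P[i, j] * y[j] for i in 1:2, j in 1:2))
@constraint(ZS, con1, x[1] + x[2] <= 1)
@constraint(ZS, con2, y[1] + y[2] <= 1)
optimize!(ZS)
value.(x), value.(y)   # inspect the solution if an optimum was found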