It is easy to save an array to a file (e.g. in .txt or .csv format) in Julia/Python, but is there any way to save a function generated by interpolating an array? Take a simple example:
using Interpolations
inter = Dict("constant" => BSpline(Constant()),
"linear" => BSpline(Linear()),
"quadratic" => BSpline(Quadratic(Line(OnCell()))),
"cubic" => BSpline(Cubic(Line(OnCell())))
)
arr = rand(100, 100, 100) # 3D array
func = interpolate(arr, inter["cubic"])
How can this function be saved for future use, so that one does not need to redo the interpolation every time the program runs?
A simple solution is to use JLD2.
using JLD2
#save "savedfunction.jld" func
And then reload with
using Interpolations, JLD2
#load "savedfunction.jld"
func
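Putting it together, a full round trip might look like the following sketch (the .jld2 file name is illustrative, and the syntax for evaluating the interpolant depends on your Interpolations.jl version; older releases use itp[x, y, z] instead of itp(x, y, z)):
using Interpolations, JLD2
arr = rand(100, 100, 100)                               # 3D array
itp = interpolate(arr, BSpline(Cubic(Line(OnCell()))))  # build the interpolant once
@save "savedfunction.jld2" itp                          # write the interpolation object to disk
# --- later, in a fresh session ---
using Interpolations, JLD2   # Interpolations must be loaded so JLD2 can reconstruct the type
@load "savedfunction.jld2" itp
itp(1.5, 2.5, 3.5)           # evaluate the reloaded interpolant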
I have a script written in Lua 5.1 that imports a third-party module and calls some functions from it. I would like to get a list of the function calls from that module, with their arguments (when they are known before execution).
So, I need to write another script which takes the source code of my first script, parses it, and extracts information from its code.
Consider the minimal example.
I have the following module:
local mod = {}
function mod.foo(a, ...)
print(a, ...)
end
return mod
And the following driver code:
local M = require "mod"
M.foo('a', 1)
M.foo('b')
What is the best way to retrieve the data about the "use" occurrences of the M.foo function?
Ideally, I would like to get the information with the name of the function being called and the values of its arguments. From the example code above, it would be enough to get the mapping like this: {'foo': [('a', 1), ('b')]}.
I'm not sure if Lua has functions for reflection to retrieve this information. So probably I'll need to use one of the existing parsers for Lua to get the complete AST and find the function calls I'm interested in.
Any other suggestions?
If you cannot modify the files, you can read them into strings, parse the mod file to find all the functions it defines, and then use that information to parse the target file for all uses of the mod library:
functions = {}
for func in modFile:gmatch("function mod%.(%w+)") do
functions[func] = {}
end
for func, call in targetFile:gmatch("M%.(%w+)%(([^%)]+)%)") do
args = {}
for arg in string.gmatch(call, "([^,]+)") do
table.insert(args, arg)
end
table.insert(functions[func], args)
end
The resulting table can then be serialized:
['foo'] = {{"'a'", " 1"}, {"'b'"}}
3 possible gotchas:
M is not a very unique name and could possibly match unintended function calls to another library.
This example does not handle if there is a function call made inside the arg list. e.g. myfunc(getStuff(), true)
The resulting table does not know the types of the args, so they are all saved as string representations.
If modifying the target file is an option, you can create a wrapper around your required module:
function log(mod)
local calls = {}
local wrapper = {
__index = function(_, k)
if mod[k] then
return function(...)
calls[k] = calls[k] or {}
table.insert(calls[k], {...})
return mod[k](...)
end
end
end,
}
return setmetatable({},wrapper), calls
end
Then you use this function like so:
local M, calls = log(require("mod"))
M.foo('a', 1)
M.foo('b')
If your module contains more than just functions, you would need to handle that in the wrapper; as written, the wrapper assumes every index is a function.
After all your calls, you can serialize the calls table to get the history of all the calls made. For the example code the table looks like:
{
['foo'] = {{'a', 1}, {'b'}}
}
I am still new to functional programming and have been trying to learn how to use transducers. I thought I had a good use case but every time I attempt to write a transducer with Ramda for it, I get the following error:
reduce: list must be array or iterable
I have tried rewriting it several ways and looked at several explanations on the web of transduction but to no avail. Any suggestions?
const data = [{cost:2,quantity:3},{cost:4,quantity:5},{cost:1,quantity:1}];
const transducer = R.compose(R.map(R.product), R.map(R.props(['cost', 'quantity'])));
const result = R.transduce(transducer, R.add, 0)(data);
console.log(result)
In the context of a transducer, compose reads left to right. You just need to invert the order of product and props:
const data = [
{cost:2,quantity:3},
{cost:4,quantity:5},
{cost:1,quantity:1}];
const transducer =
compose(
map(props(['cost', 'quantity'])),
map(product));
console.log(
transduce(transducer, add, 0, data)
)
<script src="https://cdnjs.cloudflare.com/ajax/libs/ramda/0.26.1/ramda.min.js"></script>
<script>const {compose, map, props, product, transduce, add} = R;</script>
The reason why the order reverses is that transducers utilize a property of function composition that is sometimes called abstraction from arity. It simply means that a function composition can return, well, another function:
const comp = f => g => x => f(g(x));
const mapTrace = tag => f => (console.log(tag), xs => (console.log(tag), xs.map(f)));
const sqr = x => x * x;
const main = comp(mapTrace("a")) (mapTrace("b")) (sqr); // returns another function; building it logs the 2nd map's tag and then the 1st one (normal order)
console.log(main); // main is just another function
// pass an additional argument to that function
console.log(
main([[1,2,3]])); // logs in reverse order
Why does the composition return another function? Because map is a binary function that expects a function as its first argument. So when the composition is evaluated, it yields another composition of two partially applied maps. It is this additional iteration that reverses the order. I stop at this point without illustrating the evaluation steps, because I think it would get too complicated otherwise.
Additionally, you can see now how transducers fuse two iterations together: They simply use function composition. Can you do this by hand? Yes, you can absolutely do that.
I wanted to make a function that looks at every column of a DataFrame and returns a boolean, so I end up with an array of booleans. Here is the code:
# some random dataframe
df = DataFrame([1:3, 4:6])
# a function that returns an array of boolean
function some_bool_fn(df)::Array{Bool}
array_of_arrays = colwise(df) do sdd3
# for illustration only
return true
end
array = [a[1] for a in array_of_arrays]
return array
end
# calling the function
some_bool_fn(df)
This works except I find the line
array = [a[1] for a in array_of_arrays]
a bit wasteful. Basically I get an array of arrays as the output of colwise, so I then had to put the array of arrays into a simple array of bools. Is there a way to write the code so I can avoid this line of code?
As @Gnimuc commented, this behaviour is changing.
If you look at the master branch, https://github.com/JuliaData/DataFrames.jl/blob/master/src/groupeddataframe/grouping.jl#L241, you'll see another version. You could probably copy it:
mycolwise(f, d::AbstractDataFrame) = [f(d[i]) for i in 1:ncol(d)]
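For example, with that helper the original function could be written roughly like this (a sketch assuming the question's older DataFrames column-indexing API, where d[i] returns the i-th column); since the do-block returns a Bool for every column, the comprehension already yields a Vector{Bool} and no flattening step is needed:
using DataFrames
# the comprehension-based colwise from above, repeated for completeness
mycolwise(f, d::AbstractDataFrame) = [f(d[i]) for i in 1:ncol(d)]
function some_bool_fn(df)::Array{Bool}
    return mycolwise(df) do col
        # for illustration only: any per-column test returning a Bool
        true
    end
end
df = DataFrame([1:3, 4:6])
some_bool_fn(df)   # => Bool[true, true]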
Let's say there is a type
immutable Foo
x :: Int64
y :: Float64
end
and there is a variable foo = Foo(1,2.0). I want to construct a new variable bar using foo as a prototype with field y = 3.0 (or, alternatively non-destructively update foo producing a new Foo object). In ML languages (Haskell, OCaml, F#) and a few others (e.g. Clojure) there is an idiom that in pseudo-code would look like
bar = {foo with y = 3.0}
Is there something like this in Julia?
This is tricky. In Clojure this would work with a data structure, a dynamically typed immutable map, so we simply call the appropriate method to add/change a key. But when working with types we'll have to do some reflection to generate an appropriate new constructor for the type. Moreover, unlike Haskell or the various MLs, Julia isn't statically typed, so one does not simply look at an expression like {foo with y = 1} and work out what code should be generated to implement it.
Actually, we can build a Clojure-esque solution to this, since Julia provides enough reflection and dynamism to treat the type as a sort of immutable map. We can use fieldnames to get the list of "keys" in order (like [:x, :y]) and we can then use getfield(foo, :x) to get field values dynamically:
immutable Foo
x
y
z
end
x = Foo(1,2,3)
with_slow(x, p) =
typeof(x)(((f == p.first ? p.second : getfield(x, f)) for f in fieldnames(x))...)
with_slow(x, ps...) = reduce(with_slow, x, ps)
with_slow(x, :y => 4, :z => 6) == Foo(1,4,6)
However, there's a reason this is called with_slow. Because of the reflection, it's going to be nowhere near as fast as a handwritten function like withy(foo::Foo, y) = Foo(foo.x, y, foo.z). If Foo is parametrised (e.g. Foo{T} with y::T), then Julia will be able to infer that withy(foo, 1.) returns a Foo{Float64}, but it won't be able to infer with_slow at all. As we know, this kills performance.
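For comparison, a hand-written, type-stable version looks roughly like this (the parametrised type PFoo and the withy helper below are illustrative, not part of the question):
# hypothetical parametrised variant of Foo, so the field types are part of the type
immutable PFoo{X,Y,Z}
    x::X
    y::Y
    z::Z
end
# hand-written update: copy every field except y
withy(foo::PFoo, y) = PFoo(foo.x, y, foo.z)
foo = PFoo(1, 2, 3)
bar = withy(foo, 1.0)   # inferred as PFoo{Int64,Float64,Int64}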
The only way to make this as fast as ML and co is to generate code effectively equivalent to the handwritten version. As it happens, we can pull off that version as well!
# Fields
type Field{K} end
Base.convert{K}(::Type{Symbol}, ::Field{K}) = K
Base.convert(::Type{Field}, s::Symbol) = Field{s}()
macro f_str(s)
:(Field{$(Expr(:quote, symbol(s)))}())
end
typealias FieldPair{F<:Field, T} Pair{F, T}
# Immutable `with`
for nargs = 1:5
args = [symbol("p$i") for i = 1:nargs]
@eval with(x, $([:($p::FieldPair) for p = args]...), p::FieldPair) =
with(with(x, $(args...)), p)
end
@generated function with{F, T}(x, p::Pair{Field{F}, T})
:($(x.name.primary)($([name == F ? :(p.second) : :(x.$name)
for name in fieldnames(x)]...)))
end
The first section is a hack to produce a symbol-like object, f"foo", whose value is known within the type system. The generated function is like a macro that takes types as opposed to expressions; because it has access to Foo and the field names it can generate essentially the hand-optimised version of this code. You can also check that Julia is able to properly infer the output type, if you parametrise Foo:
@code_typed with(x, f"y" => 4., f"z" => "hello") # => ...::Foo{Int,Float64,String}
(The for nargs line is essentially a manually-unrolled reduce which enables this.)
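For intuition, the nargs = 2 pass through that loop defines (roughly) this method, in terms of the FieldPair and two-argument with defined above:
# what the @eval loop generates for nargs = 2
with(x, p1::FieldPair, p2::FieldPair, p::FieldPair) = with(with(x, p1, p2), p)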
Finally, lest I be accused of giving slightly crazy advice, I want to warn that this isn't all that idiomatic in Julia. While I can't give very specific advice without knowing your use case, it's generally best to have types with a manageable (small) set of fields and a small set of functions which do the basic manipulation of those fields; you can build on those functions to create the final public API. If what you want is really an immutable dict, you're much better off just using a specialised data structure for that.
There is also setindex (without the ! at the end) implemented in the FixedSizeArrays.jl package, which does this in an efficient way.
Is there a way, using the SML Basis library, to open a file at a specific position? That is, use an operating system call to change the position, rather than scan through the file and throw away the data.
This is tricky. Unfortunately, seeking isn't directly supported. Moreover, file positions are only transparent for binary files, i.e., those that you have opened with the BinIO structure [1]. For this structure, the corresponding type BinIO.StreamIO.pos is defined to be Position.int, which is some integer type.
However, in an SML system that supports the complete I/O stack from the standard you should be able to synthesise the following seek function using the lower I/O layers:
(* seekIn : BinIO.instream * Position.int -> unit *)
fun seekIn(instream, pos) =
case BinIO.StreamIO.getReader(BinIO.getInstream instream) of
(reader as BinPrimIO.RD{setPos = SOME f, ...}, _) =>
( f pos;
BinIO.setInstream(instream,
BinIO.StreamIO.mkInstream(reader, Word8Vector.fromList[]))
)
| (BinPrimIO.RD{name, ...}, _) =>
raise IO.Io{
name = name,
function = "seekIn",
cause = IO.RandomAccessNotSupported
}
Use it like:
val file = BinIO.openIn "filename"
val _ = seekIn(file, 200)
val bin = BinIO.inputN(file, 1000)
If you need to convert from Word8Vector to string:
val s = Byte.bytesToString bin
You can do the equivalent for outstreams as well.
[1] http://standardml.org/Basis/bin-io.html#BIN_IO:SIG:SPEC
If you can manage to get hold of the reader/writer, then they should have getPos, setPos and endPos functions, depending on which kind of reader/writer you are dealing with.