Lua OOP deep copy of a table - recursion

My deep copy code:
function deepcopy(orig)
    local orig_type = type(orig)
    local copy
    if orig_type == 'table' then
        copy = {}
        for orig_key, orig_value in next, orig, nil do
            copy[deepcopy(orig_key)] = deepcopy(orig_value)
        end
        setmetatable(copy, deepcopy(getmetatable(orig)))
    else -- number, string, boolean, etc
        copy = orig
    end
    return copy
end
I'm trying to adapt this to OOP using self, but I couldn't get it to work. Here is what I've tried so far:
function block:deepcopy()
    local orig_type = type(self)
    local copy
    if orig_type == 'table' then
        copy = {}
        for orig_key, orig_value in next, self, nil do
            copy[self:deepcopy(orig_key)] = deepcopy(orig_value)
        end
        setmetatable(copy, self:deeepcopy(getmetatable(self)))
    else
        copy = orig
    end
    return copy
end

In the OOP version of the function, self:deepcopy(something) with the method syntax (colon) doesn't do what you want it to. It is equivalent to self.deepcopy(self, something); the second argument something is ignored and you just end up trying to re-copy the same self over and over until there's a stack overflow. You have to do self.deepcopy(something) with a dot to pass something as the self argument (the argument that is copied).
Calling self.deepcopy inside the definition of the deepcopy method assumes that every subtable has a self.deepcopy function. If not, you will get an "attempt to call a nil value" error. But you could do this if you want every subtable to have its own version of deepcopy that is used when copying the immediate children of that table (keys, values, metatable). For instance, you could have a subtable whose deepcopy method does not copy the metatable. Here is the basic version where the subtable has the same deepcopy method:
local block = {}

function block:deepcopy()
    if type(self) == 'table' then
        local copy = {}
        for key, value in pairs(self) do
            copy[self.deepcopy(key)] = self.deepcopy(value)
        end
        return setmetatable(copy, self.deepcopy(getmetatable(self)))
    else
        return self
    end
end
block.a = { a = 10, deepcopy = block.deepcopy }
block:deepcopy() -- works
block.a = { a = 10 }
block:deepcopy() -- error: "attempt to call a nil value (field 'deepcopy')"
But you don't need to rewrite the function at all to use it in object-oriented style. Try using your first definition of deepcopy. Do object.deepcopy = deepcopy, and then call object:deepcopy() and you will get a copy of the object.
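For example, a minimal sketch of that reuse (the table contents here are just illustrative):
local object = { x = 1, nested = { y = 2 } }
object.deepcopy = deepcopy            -- attach the first (non-OOP) definition above
local copy = object:deepcopy()        -- sugar for deepcopy(object)
print(copy.nested.y)                  --> 2
print(copy.nested ~= object.nested)   --> true: the subtable was copied, not shared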

How to return a single element from a Vec from a function?

I'm new to Rust, and I'm trying to make an interface where the user can choose a file by typing the filename from a list of available files.
This function is supposed to return the DirEntry corresponding to the chosen file:
fn ask_user_to_pick_file(available_files: Vec<DirEntry>) -> DirEntry {
    println!("Which month would you like to sum?");
    print_file_names(&available_files);
    let input = read_line_from_stdin();
    let chosen = available_files.iter()
        .find(|dir_entry| dir_entry.file_name().into_string().unwrap() == input)
        .expect("didnt match any files");
    return chosen
}
However, it appears chosen is somehow borrowed here? I get the following error:
35 |     return chosen
   |            ^^^^^^ expected struct `DirEntry`, found `&DirEntry`
Is there a way I can "unborrow" it? Or do I have to implement the Copy trait for DirEntry?
If it matters, I don't care about the Vec after this method, so if "unborrowing" chosen destroys the Vec, that's okay by me (as long as the compiler agrees).
Use into_iter() instead of iter() so you get owned values instead of references out of the iterator. After that change the code will compile and work as expected:
fn ask_user_to_pick_file(available_files: Vec<DirEntry>) -> DirEntry {
    println!("Which month would you like to sum?");
    print_file_names(&available_files);
    let input = read_line_from_stdin();
    let chosen = available_files
        .into_iter() // changed from iter() to into_iter() here
        .find(|dir_entry| dir_entry.file_name().into_string().unwrap() == input)
        .expect("didnt match any files");
    chosen
}
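The key point is that into_iter() consumes the Vec and yields owned DirEntry values, so the found element can be returned directly; the Vec cannot be used afterwards, which matches the "destroys the Vec" condition in the question. A minimal sketch of the same pattern with a simpler element type (the names here are illustrative, not from the original code):
// into_iter() moves `names`, so find() hands back an owned String.
fn pick(names: Vec<String>, wanted: &str) -> String {
    names
        .into_iter()                    // `names` is moved from here on
        .find(|n| n.as_str() == wanted) // the predicate sees &String
        .expect("no matching name")
}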

In Delphi 5, is a TList parameter always passed by reference?

I am migrating some code from Delphi 5 to a modern platform. Currently I have the compiled code (which works in my environment) and the source code (which cannot be compiled in my environment). This means I can't really experiment with the code by changing it or inserting breakpoints or dumping values. In looking at one particular passage of code, I see that one Procedure (ProcedureA) is calling another (ProcedureB) and passing in parameters that must be by reference, since otherwise ProcedureB would have no effect. It's my understanding that a var prefix must be added to parameters in a Procedure's parameter list in order for them to be passed by reference, but this is not being done here. One of the parameters, though, is of type TList, which I know to be essentially an array of pointers. My question is: are parameters of type TList (as well as others having to do with pointers) implicitly passed by reference?
Here's the code:
Procedure ProcedureB(PartyHeaderInformationPtr : PartyHeaderInformationPointer;
                     PartyHeaderTable : TTable;
                     _PrisonCode : String;
                     _FineType : TFineTypes;
                     PartyHeaderInformationList : TList);
begin
  with PartyHeaderInformationPtr^, PartyHeaderTable do
  begin
    AssessmentYear := FieldByName('TaxRollYr').Text;
    PartyType := FieldByName('PartyType').Text;
    PartyNumber := FieldByName('PartyNo').AsInteger;
    PrisonCode := _PrisonCode;
    FineType := _FineType;
  end; {with PartyHeaderInformationPtr^ ...}
  PartyHeaderInformationList.Add(PartyHeaderInformationPtr);
end; {AddPartyHeaderPointerInformation}
{=================================================================}
Procedure ProcedureA(PartyHeaderTable : TTable;
                     PartyDetailTable : TTable;
                     PartyHeaderInformationList : TList);
var
  Done, FirstTimeThrough : Boolean;
  PrisonPartyFound, JunglePartyFound : Boolean;
  PrisonPartyYear, PrisonCode, PartyType : String;
  PartyHeaderInformationPtr : PartyHeaderInformationPointer;
begin
  PartyHeaderTable.Last;
  PrisonPartyYear := '';
  PrisonPartyFound := False;
  JunglePartyFound := False;
  Done := False;
  FirstTimeThrough := True;
  repeat
    If FirstTimeThrough
      then FirstTimeThrough := False
      else PartyHeaderTable.Prior;
    If PartyHeaderTable.BOF
      then Done := True;
    If not Done
      then
      begin
        PartyType := PartyHeaderTable.FieldByName('PartyType').Text;
        If ((not JunglePartyFound) and
            ((PartyType = 'MU') or
             (PartyType = 'TO')))
          then
          begin
            JunglePartyFound := True;
            New(PartyHeaderInformationPtr);
            AddPartyHeaderPointerInformation(PartyHeaderInformationPtr,
                                             PartyHeaderTable,
                                             '', ltPlace,
                                             PartyHeaderInformationList);
          end; {If ((not JunglePartyFound) and ...}
      end; {If not Done}
  until Done;
end; {FillPartyHeaderInformationList}
Yes.
In Delphi, classes are reference types.
Every variable of type TBitmap, TList, TButton, TStringList, TForm etc. is nothing but a pointer to the object, so an object is always passed "by reference". It is only this address, this native-sized integer, that is given to the called routine.
Consequently, even without var, the called routine can alter the object since it, like the caller, has the address to it. But the pointer itself is passed by value, so if the called routine alters the parameter pointer to point to a different object, the caller will not see that; only the called routine's copy of the address is changed. With var, the pointer itself is passed by reference, so the called routine can change that too: it can change the original object, and it can make the caller's variable point to a different object, if it wants to.
On the other hand, value types like integers, booleans, sets, static arrays, and records are passed by value, so -- without any parameter decoration such as var -- the called routine gets a copy, and any changes made are only made to that copy. The caller will not see its variable being modified. If you use a var parameter, however, the variable will be passed by reference.
So, in your case, it has nothing to do with TList being a "list" or being something that "contains pointers". It's about TList being a class.
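As a minimal sketch of the distinction (illustrative only, not from the original code):
procedure ModifyList(List: TList);   { no "var": only the reference is copied }
begin
  List.Add(nil);   { visible to the caller: the object List points to is changed }
  List := nil;     { NOT visible to the caller: only the local copy of the reference is reassigned }
end;
Declare the parameter as var List: TList and the reassignment would reach the caller's variable as well.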

How to extend a Java object in JNLua

I am experimenting with using JNLua's javavm module to connect with and extend a Java library (JAR). So far I am super impressed with how easy it is to pass Java objects back and forth between Lua and Java; it is seamless.
Now I am interested in extending these Java objects in Lua. In my naive approach I've wrapped the Java object in a Lua class with the intent of extending that object's API, i.e. adding methods to it. But I don't want to have to recreate, within the wrapper, all of the Java object's methods. It seems like I should be able to inherit from the Java object, so that when a method is missing from my wrapper, Lua will look for it in the Java object that is a member of the wrapper class. I've tried adapting the examples shown in Inheritance, but this is a slightly trickier thing to set up, given that I'm dealing with a Java object. Thoughts?
I found my answer in the SO question below:
Add members dynamically to a class using Lua + SWIG
I needed to realize I was dealing with a userdata object, not a table, so there was no way to add members directly; I needed some metatable kung-fu.
The code below allows me to extend a Java object, i.e. add methods to it.
function Model:new (model)
    o = {}
    WrapObject(Model, o, model)
    self.__index = self
    self.model = model or nil
    return o
end

function WrapObject(class, object, userData)
    local wrapper_metatable = {}
    function wrapper_metatable.__index(self, key)
        local ret = rawget(class, key)
        if (not ret) then
            ret = userData[key]
            if (type(ret) == "function") then
                return function(self, ...)
                    return ret(userData, ...)
                end
            else
                return ret
            end
        else
            return ret
        end
    end
    setmetatable(object, wrapper_metatable)
    return object
end

function Model:Test ()
    name = self:GetFullName()
    fileName = self:GetFileName()
    ret = name .. fileName
    print("It's a test!!")
    return ret
end
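A hypothetical usage sketch (javaModel stands for a Java object obtained through JNLua; the forwarded method names are just the ones already used in Model:Test above):
local m = Model:new(javaModel)
print(m:Test())         -- defined on the Lua wrapper
print(m:GetFileName())  -- not found on the wrapper, so __index forwards the
                        -- call to the underlying Java object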

How to implement a Binary Search Tree in Julia?

I am trying to implement a BST in Julia, but I encountered a problem when I call the insert function. When I try to create a new node, the structure stays unchanged.
My code:
type Node
    key::Int64
    left
    right
end

function insert(key::Int64, node)
    if node == 0
        node = Node(key, 0, 0)
    elseif key < node.key
        insert(key, node.left)
    elseif key > node.key
        insert(key, node.right)
    end
end

root = Node(0,0,0)
insert(1,root)
insert(2,root)
I also tried changing the zeros to nothing. The next version I tried had the field types declared in Node, but when I called insert with a nothing value (similar to C's NULL) it gave me an error.
Thanks for any answer.
The code has a number of issues.
One is that it is not type stable: left and right could contain anything.
The other is that assigning to the variable node inside the insert function only rebinds the local name and will not change anything outside the function (see the sketch just after this list).
A stylistic issue: functions that modify their arguments generally have a ! as the last character in the name, such as insert!, push!, and setindex!.
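To see the second point concretely, here is a small sketch using the question's own Node type (not part of the original answer):
function rebind(node)
    node = Node(42, 0, 0)   # rebinds the local name only
end

n = Node(1, 0, 0)
rebind(n)
n.key   # still 1: the caller's variable is untouched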
I think the following should work:
type BST
    key::Int
    left::Nullable{BST}
    right::Nullable{BST}
end

BST(key::Int) = BST(key, Nullable{BST}(), Nullable{BST}())
BST() = BST(0)

function Base.push!(node::BST, key)
    if key < node.key
        if node.left.isnull
            node.left = BST(key)
        else
            push!(node.left.value, key)
        end
    elseif key > node.key
        if node.right.isnull
            node.right = BST(key)
        else
            push!(node.right.value, key)
        end
    end
end

root = BST()
push!(root, 1)
push!(root, 2)
This version overloads the Julia push! function, and uses the Nullable type so that it can distinguish between a leaf node and a node where it needs to continue searching for the place to insert the key. This is a recursive definition and is not optimized; it would be much faster to do it with loops instead of recursion.
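A small usage sketch, just to show the shape of the tree push! builds (the field accesses are only for inspection and are not part of the original answer):
root = BST(5)
push!(root, 3)
push!(root, 8)
root.left.value.key    # 3
root.right.value.key   # 8
root.right.isnull      # false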

Creating Sequence of Sequences is Causing a StackOverflowException

I'm trying to take a large file and split it into many smaller files. The location where each split occurs is determined by a predicate applied to the contents of each line (the isNextObject function).
I have attempted to read in the large file via the File.ReadLines function so that I can iterate through the file one line at a time without having to hold the entire file in memory. My approach was to group the sequence into a sequence of smaller sub-sequences (one per file to be written out).
I found a useful function that Tomas Petricek created on fssnip called groupWhen. This function worked great for my initial testing on a small subset of the file, but a StackOverflowException is thrown when using the real file. I am not sure how to adjust the groupWhen function to prevent this (I'm still an F# greenie).
Here is a simplified version of the code showing only the relevant parts that will recreate the StackOverflowException:
// This is the function created by Tomas Petricek where the StackOverflowException is occurring
module Seq =
    /// Iterates over elements of the input sequence and groups adjacent elements.
    /// A new group is started when the specified predicate holds about the element
    /// of the sequence (and at the beginning of the iteration).
    ///
    /// For example:
    ///    Seq.groupWhen isOdd [3;3;2;4;1;2] = seq [[3]; [3; 2; 4]; [1; 2]]
    let groupWhen f (input:seq<_>) = seq {
        use en = input.GetEnumerator()
        let running = ref true

        // Generate a group starting with the current element. Stops generating
        // when it finds an element such that 'f en.Current' is 'true'
        let rec group() =
            [ yield en.Current
              if en.MoveNext() then
                  if not (f en.Current) then yield! group() // *** Exception occurs here ***
                  else running := false ]

        if en.MoveNext() then
            // While there are still elements, start a new group
            while running.Value do
                yield group() |> Seq.ofList }
This is the gist of the code making use of Tomas' function:
module Extractor =

    open System
    open System.IO
    open Microsoft.FSharp.Reflection

    // ... elided a few functions, including "isNextObject", which is
    // a string -> bool (examines the line and returns true
    // if the string indicates that we are at the
    // start of the next inner file)

    let writeFile outputDir file =
        // ... write out "file" to the file system
        // NOTE: file is a seq<string>

    let writeFiles outputDir (files : seq<seq<_>>) =
        files
        |> Seq.iter (fun file -> writeFile outputDir file)
And here is the relevant code in the console application that makes use of the functions:
let lines = inputFile |> File.ReadLines
writeFiles outputDir (lines |> Seq.groupWhen isNextObject)
Any ideas on the proper way to stop groupWhen from blowing the stack? I'm not sure how I would convert the function to use an accumulator (or to use a continuation instead, which I think is the correct terminology).
The problem with this is that the group() function returns a list, which is an eagerly evaluated data structure. Every time you call group() it has to run to the end, collect all results in a list, and return the list, so the recursive call happens within that same evaluation, i.e. truly recursively, creating stack pressure.
To mitigate this problem, you could just replace the list with a lazy sequence:
let rec group() = seq {
    yield en.Current
    if en.MoveNext() then
        if not (f en.Current) then yield! group()
        else running := false }
However, I would consider less drastic approaches. This example is a good illustration of why you should avoid doing recursion yourself and resort to ready-made folds instead.
For example, judging by your description, it seems that Seq.windowed may work for you.
It's easy to overuse sequences in F#, IMO. You can accidentally get stack overflows, plus they are slow.
So (not actually answering your question),
personally I would just fold over the seq of lines using something like this:
let isNextObject line =
    line = "---"

type State = {
    fileIndex : int
    filename : string
    writer : System.IO.TextWriter
    }

let makeFilename index =
    sprintf "File%i" index

let closeFile (state:State) =
    //state.writer.Close() // would use this in real code
    state.writer.WriteLine("=== Closing {0} ===", state.filename)

let createFile index =
    let newFilename = makeFilename index
    let newWriter = System.Console.Out // dummy
    newWriter.WriteLine("=== Creating {0} ===", newFilename)
    // create new state with new writer
    {fileIndex = index + 1; writer = newWriter; filename = newFilename}

let writeLine (state:State) line =
    if isNextObject line then
        // finish old file here
        closeFile state
        // create new file here and return updated state
        createFile state.fileIndex
    else
        // write the line to the current file
        state.writer.WriteLine(line)
        // return the unchanged state
        state

let processLines (lines: string seq) =
    // setup
    let initialState = createFile 1
    // process the file
    let finalState = lines |> Seq.fold writeLine initialState
    // tidy up
    closeFile finalState
(Obviously a real version would use files rather than the console)
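For instance, a minimal sketch of a file-backed createFile, assuming an extra outputDir parameter and the standard System.IO.StreamWriter (not part of the original answer):
let createRealFile outputDir index =
    let newFilename = makeFilename index
    let path = System.IO.Path.Combine(outputDir, newFilename)
    // upcast to TextWriter so the value fits the State record
    let newWriter = new System.IO.StreamWriter(path) :> System.IO.TextWriter
    {fileIndex = index + 1; writer = newWriter; filename = newFilename}
closeFile would then call state.writer.Close() (the line commented out above) so each file is flushed to disk.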
Yes, it is crude, but it is easy to reason about, with
no unpleasant surprises.
Here's a test:
processLines [
"a"; "b"
"---";"c"; "d"
"---";"e"; "f"
]
And here's what the output looks like:
=== Creating File1 ===
a
b
=== Closing File1 ===
=== Creating File2 ===
c
d
=== Closing File2 ===
=== Creating File3 ===
e
f
=== Closing File3 ===
