What is the proper way to organize a Julia source tree?

I am trying to figure out the proper way to organize the source tree for a Julia application, seqscan. For now I have the following tree:
$ tree seqscan/
seqscan/
├── LICENSE
├── README.md
├── benchmark
├── doc
├── examples
├── src
│   └── seq.jl
└── test
    └── test_seq.jl
5 directories, 4 files
The file seq.jl contains
module SeqScan

module Seq

export SeqEntry

type SeqEntry
    id
    seq
    scores
    seq_type
end

end # module Seq
end # module SeqScan
and test_seq.jl contains:
module TestSeq

using Base.Test
using SeqScan.Seq

@testset "Testing SeqEntry" begin
    @testset "test SeqEntry creation" begin
        seq_entry = SeqEntry("test", "atcg")
        @test seq_entry.id == "test"
        @test seq_entry.seq == "atcg"
    end
end

end # module TestSeq
However, running the test code yields an error:
ERROR: LoadError: ArgumentError: Module SeqScan not found in current path.
even after setting the JULIA_LOAD_PATH environment variable to include seqscan or seqscan/src, so I must be doing something wrong?

The name of your package (the root of your local tree) needs to match the name of a file that exists under the src directory. Try this:
SeqScan/
|-- src/
|   |-- SeqScan.jl   (your seq.jl)
I don't know why you are enclosing the module Seq inside SeqScan. If there is no important reason to do that, you can access the type more directly: remove "module Seq" and its paired "end", and then a plain "using SeqScan" will bring in the type SeqEntry, as sketched below.
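For reference, a minimal src/SeqScan.jl along those lines (the same fields as the question's seq.jl, with only the nested Seq module removed) might look like this:
module SeqScan

export SeqEntry

type SeqEntry
    id
    seq
    scores
    seq_type
end

end # module SeqScan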
The type SeqEntry, as written, knows what to do when given four field values, one for each of the defined fields. If you want to construct it with just the first two fields, you need to add a two-argument constructor. For example, assuming seq is a vector of some numeric type, scores is a vector of that same type, and seq_type stores that element type:
function SeqEntry(id, seq)
    seq_type = typeof(seq[1])
    scores = zeros(seq_type, length(seq))
    return SeqEntry(id, seq, scores, seq_type)
end
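A quick illustration of how that constructor would behave, with made-up values (assuming, as above, that seq is a numeric vector):
entry = SeqEntry("test", [1.0, 2.0, 3.0])
entry.scores    # => [0.0, 0.0, 0.0]
entry.seq_type  # => Float64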
An example of a package with internal modules, for Julia v0.5.
The package is named MyPackage.jl; it incorporates two internal modules, TypeModule and FuncModule, and each module has its own file: TypeModule.jl and FuncModule.jl.
TypeModule contains a new type, MyType. FuncModule contains a new function, MyFunc, which operates on values of MyType. There are two methods of that function, a 1-argument and a 2-argument version.
MyPackage uses both internal modules: it includes each one, initializes two variables of MyType, applies MyFunc to them, and prints the results.
I assume Julia's package directory is "/you/.julia/v0.5" (Windows: "c:\you\.julia\v0.5"), and refer to it as PkgDir. You can find the real package directory by typing Pkg.dir() at Julia's interactive prompt. The first thing to do is make sure Julia's internal package information is current, and then get a helper package called PkgDev:
> Pkg.update()
> Pkg.add("PkgDev")
You might start your package on GitHub. If you are starting it locally, you should use PkgDev because it creates the essential package file (and others) with the right structure:
> using PkgDev
> PkgDev.generate("MyPackage", "MIT")
This also creates a LICENSE.md file with the MIT license. You can keep it, replace it, or remove it.
In the directory PkgDir/MyPackage/src, create a subdirectory "internal". In PkgDir/MyPackage/src/internal, create two files, "TypeModule.jl" and "FuncModule.jl", with the following contents:
TypeModule.jl:
module TypeModule

export MyType

type MyType
    value::Int
end

end # TypeModule
FuncModule.jl:
module FuncModule

export MyFunc

#=
!important!
TypeModule is included in MyPackage.jl before this module.
This module gets MyType from MyPackage, not from TypeModule directly.
Including TypeModule.jl here as well would create a second, distinct
copy of the module, and the two MyType types would not match.
=#
import ..MyPackage: MyType

function MyFunc(x::MyType)
    return x.value + 1
end

function MyFunc(x::MyType, y::MyType)
    return x.value + y.value + 1
end

end # FuncModule
And in the src directory, edit MyPackage.jl so it matches this:
MyPackage.jl:
module MyPackage

export MyType, MyFunc

#=
!important! Do this before including FuncModule
because FuncModule.jl imports MyType from here.
MyType must be in use before including FuncModule.
=#
include( joinpath("internal", "TypeModule.jl") )
using .TypeModule  # prefix the '.' to use an included module

include( joinpath("internal", "FuncModule.jl") )
using .FuncModule  # prefix the '.' to use an included module

three = MyType(3)
five  = MyType(5)

four  = MyFunc(three)
eight = MyFunc(three, five)

# show that everything works
println()
println( string("MyFunc(three) = ", four) )
println( string("MyFunc(three, five) = ", eight) )

end # MyPackage
Now, start Julia and enter using MyPackage; you should see this:
julia> using MyPackage
MyFunc(three) = 4
MyFunc(three, five) = 9
julia>

Related

Self-referential values in an R config file

Using the config package, I'd like elements to reference other elements,
like how path_file_a references path_directory.
config.yml file in the working directory:
default:
  path_directory : "data-public"
  path_file_a : "{path_directory}/a.csv"
  path_file_b : "{path_directory}/b.csv"
  path_file_c : "{path_directory}/c.csv"
  # recursive : !expr file.path(config::get("path_directory"), "c.csv")
Code:
config <- config::get()
config$path_file_a
# Returns: "{path_directory}/a.csv"
glue::glue(config$path_file_a, .envir = config)
# Returns: "data-public/a.csv"
I can use something like glue::glue() on the value returned by config$path_file_a.
But I'd prefer to have the value already substituted so config$path_file_a contains the actual value (not the template for the value).
As you might expect, uncommenting the recursive line creates an endless self-referential loop.
Are there better alternatives to glue::glue(config$path_file_a, .envir = config)?
I came across the same problem and I've written a wrapper around config and glue.
The package is called gonfig and has been submitted to CRAN.
With it you would have:
config.yml
default:
  path_directory : "data-public"
  path_file_a : "{path_directory}/a.csv"
  path_file_b : "{path_directory}/b.csv"
  path_file_c : "{path_directory}/c.csv"
And in your R script:
config <- gonfig::get()
config$path_file_c
#> "data-public/c.csv"
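If you would rather stay with config and glue directly, a small post-processing step can do the substitution once after loading. This is only a sketch under the assumption that every templated value is a single character string and only references keys defined earlier in the file; resolve_config is a hypothetical helper, not part of either package.
resolve_config <- function(config = config::get()) {
  for (name in names(config)) {
    value <- config[[name]]
    if (is.character(value) && length(value) == 1) {
      # substitute {key} references using the values resolved so far
      config[[name]] <- as.character(glue::glue_data(config, value))
    }
  }
  config
}

config <- resolve_config()
config$path_file_a
#> [1] "data-public/a.csv"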

Call Java from R

I want to execute Java code from R. I used the rJava package and was able to run simple Java code, such as creating an object or printing to the screen.
require("rJava")
.jinit()
test <- new(J("java.lang.String"), "Hello World!")
However, what I want to do is send a data frame from R (or a CSV file), execute some Java code on it, and then return the output file to R. At the same time, it is difficult in my case to call the R code from Java, as I want to process the CSV file first in R, then apply the Java code to it, and return the result to R to complete the analysis.
I'd go the following way here:
1. Process the CSV file inside R.
2. Save the file somewhere and make sure you know its explicit location (e.g. /home/user/some_csv_file.csv).
3. Create an adapter class in Java that has a method String processFile(String file).
4. Inside processFile, read the file, pass it to your Java code and do the Java-based processing.
5. Store the output file somewhere and return its location.
6. Inside R, get the result of the processFile method and do further processing in R.
At least, that's what I'd do as a first draft of a solution for your problem.
Update
We need a Java file:
// sample/Adapter.java
package sample;

public class Adapter {

    public String processFile(String file) {
        System.out.println("I am processing file: " + file);
        return "new_file_location.csv";
    }

    public static void main(String[] args) {
        Adapter adp = new Adapter();
        System.out.println("Result: " + adp.processFile("initial_file.csv"));
    }
}
We have to compile it
> mkdir target
> javac -d target sample/Adapter.java
> java -cp target sample.Adapter
I am processing file: initial_file.csv
Result: new_file_location.csv
> export CLASSPATH=`pwd`/target
> R
We have to call it from R
> library(rJava)
> .jinit()
> obj <- .jnew("sample.Adapter")
> s <- .jcall(obj, returnSig="Ljava/lang/String;", method="processFile", 'initial_file')
I am processing file: initial_file
> s
[1] "new_file_location.csv"
And your source directory looks like this
.
├── sample
│   └── Adapter.java
└── target
    └── sample
        └── Adapter.class
In processFile you can do whatever you like and call your existing Java code.
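To close the loop with a data frame, one possible pattern looks like the sketch below; my_df and the file paths are purely illustrative, and the Adapter class is the one compiled above.
library(rJava)
.jinit()  # assumes target/ with sample.Adapter is on the CLASSPATH, as above

# 1. process the data in R and write it to a known location
write.csv(my_df, "/tmp/input.csv", row.names = FALSE)

# 2. let Java transform the file and report where the result lives
adapter  <- .jnew("sample.Adapter")
out_path <- .jcall(adapter, returnSig = "Ljava/lang/String;",
                   method = "processFile", "/tmp/input.csv")

# 3. read the Java output back into R and continue the analysis
result <- read.csv(out_path)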

Inno Setup Choose a directory to install files from a pre-defined set

In this situation, I need to install a file to a specific directory, but on different computers it might be in a different folder, so I need to check which one is correct.
For example, I have a file that needs to be installed into folder A, B or C, depending on whether the computer has A, B or C. So I need to check them first; if the computer has B, then install the file into the B folder, etc.
I know I can use a Check parameter after the file's DestDir; if the directory doesn't exist then nothing gets installed, but what I need is to install that file to one of the other directories.
Thanks in advance.
In the InitializeSetup event function, check for the existence of your pre-defined set of directories and remember the one you find. Then set the default installation path to the found one using a scripted constant in the DefaultDirName directive.
You will possibly also want to set DisableDirPage=yes and UsePreviousAppDir=no.
[Setup]
DefaultDirName={code:GetDirName}
DisableDirPage=yes
UsePreviousAppDir=no

[Files]
Source: "MyProg.exe"; DestDir: "{app}"
Source: "MyProg.chm"; DestDir: "{app}"

[Code]

var
  DirName: string;

function TryPath(Path: string): Boolean;
begin
  Result := DirExists(Path);
  if Result then
  begin
    Log(Format('Path %s exists', [Path]));
    DirName := Path;
  end
  else
  begin
    Log(Format('Path %s does not exist', [Path]));
  end;
end;

function GetDirName(Param: string): string;
begin
  Result := DirName;
end;

function InitializeSetup(): Boolean;
begin
  Result :=
    TryPath('C:\path1') or
    TryPath('C:\path2') or
    TryPath('C:\path3');
  if Result then
  begin
    Log(Format('Destination %s selected', [DirName]));
  end
  else
  begin
    MsgBox('No destination found, aborting installation', mbError, MB_OK);
  end;
end;
Instead of using DefaultDirName={code:GetDirName}, you can also use DestDir: "{code:GetDirName}" in the respective entries of the [Files] section, if appropriate.
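For example, a sketch of that variant, reusing the GetDirName function from the [Code] section above (only the [Files] entries change):
[Files]
Source: "MyProg.exe"; DestDir: "{code:GetDirName}"
Source: "MyProg.chm"; DestDir: "{code:GetDirName}"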

Premake isn't picking up dependencies

Recently I've changed from CMake to Premake (v5.0.0-alpha8) and I'm not quite sure how to achieve the following in Premake.
I want to include some dependencies so in CMake I can do something like this:
target_link_libraries(${PROJECT_NAME}
    ${YALLA_ABS_PLATFORM}
    ${YALLA_LIBRARY})
The above will add the paths of these libraries (dir) to "Additional Include Directories" in the compiler and it will also add an entry (lib) to "Additional Dependencies" in the linker so I don't need to do anything special beyond calling target_link_libraries.
So I expected that when I'm doing something like this in Premake:
links {
    YALLA_LIBRARY
}
I'd get the same result but I don't.
I also tried to use libdirs, but it doesn't really work: I can't see the library directory and its subdirectories passed to the compiler as "Additional Include Directories" (/I), or Yalla.Library.lib passed to the linker as "Additional Dependencies".
Here is the directory structure I use:
.
|-- src
|   |-- launcher
|   |-- library
|   |   `-- utils
|   `-- platform
|       |-- abstract
|       `-- win32
`-- tests
    `-- platform
        `-- win32
The library dir is defined in Premake as follows:
project(YALLA_LIBRARY)
    kind "SharedLib"
    files {
        "utils/string-converter.hpp",
        "utils/string-converter.cpp",
        "defines.hpp"
    }
The platform dir is defined in Premake as follows:
project(YALLA_PLATFORM)
    kind "SharedLib"
    includedirs "abstract"
    links {
        YALLA_LIBRARY
    }

if os.get() == "windows" then
    include "win32"
else
    return -- OS NOT SUPPORTED
end
The win32 dir is defined in Premake as follows:
files {
    "event-loop.cpp",
    "win32-exception.cpp",
    "win32-exception.hpp",
    "win32-window.cpp",
    "win32-window.hpp",
    "window.cpp"
}
And finally at the root dir I have the following Premake file:
PROJECT_NAME = "Yalla"

-- Sets global constants that represent the projects' names
YALLA_LAUNCHER = PROJECT_NAME .. ".Launcher"
YALLA_LIBRARY = PROJECT_NAME .. ".Library"
YALLA_ABS_PLATFORM = PROJECT_NAME .. ".AbstractPlatform"
YALLA_PLATFORM = PROJECT_NAME .. ".Platform"

workspace(PROJECT_NAME)
    configurations { "Release", "Debug" }
    flags { "Unicode" }
    startproject ( YALLA_LAUNCHER )
    location ( "../lua_build" )

include "src/launcher"
include "src/library"
include "src/platform"
I'm probably misunderstanding how Premake works due to lack of experience with it.
I solved it by creating a new global function named includedeps.
function includedeps(workspace, ...)
    local workspace = premake.global.getWorkspace(workspace)
    local args = { ... }
    local args_count = select("#", ...)
    local func = select(args_count, ...)
    if type(func) == "function" then
        args_count = args_count - 1
        args = table.remove(args, args_count)
    else
        func = nil
    end
    for i = 1, args_count do
        local projectName = select(i, ...)
        local project = premake.workspace.findproject(workspace, projectName)
        if project then
            local topIncludeDir, dirs = path.getdirectory(project.script)
            if func then
                dirs = func(topIncludeDir)
            else
                dirs = os.matchdirs(topIncludeDir .. "/**")
                table.insert(dirs, topIncludeDir)
            end
            includedirs(dirs)
            if premake.project.iscpp(project) then
                libdirs(dirs)
            end
            links(args)
        else
            error(string.format("project '%s' does not exist.", projectName), 3)
        end
    end
end
Usage:
includedeps(PROJECT_NAME, YALLA_LIBRARY)
or
includedeps(PROJECT_NAME, YALLA_PLATFORM, function(topIncludeDir)
    return { path.join(topIncludeDir, "win32") }
end)
Update:
For this to work properly, you need to make sure that the dependencies are included in dependency order, not in the order of the directory structure.
For example, if the dependency graph is launcher --> platform --> library, then you have to include them in the following order:
include "src/library"
include "src/platform"
include "src/launcher"
As opposed to the directory structure, which in my case is as follows:
src/launcher
src/library
src/platform
If you include them in directory order instead, it will fail and tell you that "project 'Yalla.Platform' does not exist."

Include a simple val in sbt build files from global.sbt

I wish to set my version numbers externally across several build.sbt files through a single include file.
Within build.sbt I can do this
val base = "1.1"
version := base + ".8-SNAPSHOT"
This works fine as a first step.
According to the online help, I should be able to create a file global.sbt in my ~/.sbt/0.13 folder.
I created the file global.sbt with single line
val base = "1.1"
and removed the corresponding line from build.sbt
But when I start up my sbt I get "error: not found: value base"
So either it's not finding the global sbt or this form of global setting doesn't work.
Any suggestions as to how I can resolve this?
Can I make an explicit include command in my build.sbt files?
It seems from your test that vals in global ~/.sbt/0.13/*.sbt files don't propagate to local *.sbt files.
Here's a setup that works:
~/.sbt/0.13/plugins/VersionBasePlugin.scala
import sbt._, Keys._

object VersionBasePlugin extends AutoPlugin {
  override def requires = plugins.CorePlugin
  override def trigger = allRequirements

  object autoImport {
    val versionBase = settingKey[String]("version base")
  }
  import autoImport._

  override def projectSettings = Seq(versionBase := "1.1")
}
and then in your build.sbt:
version := (versionBase.value + ".8-SNAPSHOT")
Does that work for you?
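If a particular build needs a different base, it should also be possible to override the setting locally in that project's build.sbt (a sketch, using the versionBase key defined by the plugin above), since build.sbt settings are applied after the plugin's defaults:
versionBase := "1.2"

version := versionBase.value + ".0-SNAPSHOT"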
