JuliaLang: ERROR: LoadError: IOError: open("..) - julia

I am trying to generate an executable, so I created a Net module inside .Net/src/Net.jl. However, when I run packagecompiler.jl (below), it gives an error. How can I solve the problem?
module Net

using NativeFileDialog, JLD2, Parameters

Base.@ccallable function julia_main()::Cint
    try
        netlisttranslator()
    catch
        Base.invokelatest(Base.display_error, Base.catch_stack())
        return 1
    end
    return 0
end

#=
All sub-functions called by netlisttranslator() are defined here.
netlisttranslator()
=#

end # module
The packagecompiler.jl script is:
using PackageCompiler: PackageCompiler, create_sysimage, create_app, create_library
using Pkg
ENV["JULIA_DEBUG"] = "PackageCompiler"
Pkg.activate("$(@__DIR__)")
Pkg.add("PackageCompiler")
Pkg.add("NativeFileDialog")
Pkg.add("JLD2")
Pkg.add("Parameters")
Pkg.resolve()
Pkg.instantiate(; verbose = false)
tmp_app_source_dir = "$(@__DIR__)"
app_compiled_dir = "$(@__DIR__)\\generated"
create_app(tmp_app_source_dir, app_compiled_dir; incremental=false, force=true,
    executables=["Net" => "julia_main"])
Running this script produces:
Activating environment at `c:\Users\amroa\OneDrive - polymtl.ca\AmroAlsabbagh\Code\Julia\Tests and Notes\12- Create exe file\NetlistEMTP\Net\Project.toml`
Updating registry at `C:\Users\amroa\.julia\registries\General`
Updating git-repo `https://github.com/JuliaRegistries/General.git`
Resolving package versions...
No Changes to `C:\Users\amroa\OneDrive - polymtl.ca\AmroAlsabbagh\Code\Julia\Tests and Notes\12- Create exe file\NetlistEMTP\Net\Project.toml`
No Changes to `C:\Users\amroa\OneDrive - polymtl.ca\AmroAlsabbagh\Code\Julia\Tests and Notes\12- Create exe file\NetlistEMTP\Net\Manifest.toml`
Resolving package versions...
No Changes to `C:\Users\amroa\OneDrive - polymtl.ca\AmroAlsabbagh\Code\Julia\Tests and Notes\12- Create exe file\NetlistEMTP\Net\Project.toml`
No Changes to `C:\Users\amroa\OneDrive - polymtl.ca\AmroAlsabbagh\Code\Julia\Tests and Notes\12- Create exe file\NetlistEMTP\Net\Manifest.toml`
Resolving package versions...
No Changes to `C:\Users\amroa\OneDrive - polymtl.ca\AmroAlsabbagh\Code\Julia\Tests and Notes\12- Create exe file\NetlistEMTP\Net\Project.toml`
No Changes to `C:\Users\amroa\OneDrive - polymtl.ca\AmroAlsabbagh\Code\Julia\Tests and Notes\12- Create exe file\NetlistEMTP\Net\Manifest.toml`
Resolving package versions...
No Changes to `C:\Users\amroa\OneDrive - polymtl.ca\AmroAlsabbagh\Code\Julia\Tests and Notes\12- Create exe file\NetlistEMTP\Net\Project.toml`
No Changes to `C:\Users\amroa\OneDrive - polymtl.ca\AmroAlsabbagh\Code\Julia\Tests and Notes\12- Create exe file\NetlistEMTP\Net\Manifest.toml`
No Changes to `C:\Users\amroa\OneDrive - polymtl.ca\AmroAlsabbagh\Code\Julia\Tests and Notes\12- Create exe file\NetlistEMTP\Net\Project.toml`
No Changes to `C:\Users\amroa\OneDrive - polymtl.ca\AmroAlsabbagh\Code\Julia\Tests and Notes\12- Create exe file\NetlistEMTP\Net\Manifest.toml`
PackageCompiler: bundled artifacts:
├── ATK_jll - 2.219 MiB
├── Bzip2_jll - 2.232 MiB
├── Cairo_jll - 13.579 MiB
├── Expat_jll - 1.125 MiB
├── Fontconfig_jll - 3.017 MiB
├── FreeType2_jll - 4.737 MiB
├── FriBidi_jll - 570.671 KiB
├── GTK3_jll - 73.302 MiB
├── Gettext_jll - 19.930 MiB
├── Glib_jll - 20.546 MiB
├── Graphite2_jll - 696.780 KiB
├── HarfBuzz_jll - 6.257 MiB
├── JpegTurbo_jll - 4.951 MiB
├── LERC_jll - 758.367 KiB
├── LZO_jll - 1.084 MiB
├── Libepoxy_jll - 13.043 MiB
├── Libffi_jll - 205.667 KiB
├── Libgcrypt_jll - 6.433 MiB
├── Libgpg_error_jll - 2.227 MiB
├── Libiconv_jll - 2.245 MiB
├── Libtiff_jll - 9.779 MiB
├── NativeFileDialog_jll - 474.944 KiB
├── PCRE_jll - 4.222 MiB
├── Pango_jll - 4.738 MiB
├── Pixman_jll - 5.945 MiB
├── Wayland_protocols_jll - 465.663 KiB
ERROR: LoadError: IOError: open("c:\\Users\\amroa\\OneDrive - polymtl.ca\\AmroAlsabbagh\\Code\\Julia\\Tests and Notes\\12- Create exe file\\NetlistEMTP\\Net\\generated\\share\\julia\\artifacts\\f42f9d226c70f0bc88e5f897d914d9de1ac2ce03\\share\\wayland-protocols\\unstable\\fullscreen-shell\\fullscreen-shell-unstable-v1.xml", 769, 33206): no such file or directory (ENOENT)
Stacktrace:
[1] uv_error
@ .\libuv.jl:97 [inlined]
[2] open(path::String, flags::UInt16, mode::UInt64)
@ Base.Filesystem .\filesystem.jl:87
[3] sendfile(src::String, dst::String)
@ Base.Filesystem .\file.jl:956
[4] cptree(src::String, dst::String; force::Bool, follow_symlinks::Bool)
@ Base.Filesystem .\file.jl:331
[5] cptree(src::String, dst::String; force::Bool, follow_symlinks::Bool) (repeats 4 times)
@ Base.Filesystem .\file.jl:328
[6] cp(src::String, dst::String; force::Bool, follow_symlinks::Bool)
@ Base.Filesystem .\file.jl:353
[7] cp
@ .\file.jl:349 [inlined]
[8] bundle_artifacts(ctx::Pkg.Types.Context, dest_dir::String; include_lazy_artifacts::Bool)
@ PackageCompiler C:\Users\amroa\.julia\packages\PackageCompiler\wpsGv\src\PackageCompiler.jl:1124
[9] create_app(package_dir::String, app_dir::String; executables::Vector{Pair{String, String}}, precompile_execution_file::Vector{String}, precompile_statements_file::Vector{String}, incremental::Bool, filter_stdlibs::Bool, force::Bool, c_driver_program::String, cpu_target::String, include_lazy_artifacts::Bool, sysimage_build_args::Cmd, include_transitive_dependencies::Bool)
@ PackageCompiler C:\Users\amroa\.julia\packages\PackageCompiler\wpsGv\src\PackageCompiler.jl:699
[10] top-level scope
@ c:\Users\amroa\OneDrive - polymtl.ca\AmroAlsabbagh\Code\Julia\Tests and Notes\12- Create exe file\NetlistEMTP\Net\packagecompiler.jl:25
in expression starting at c:\Users\amroa\OneDrive - polymtl.ca\AmroAlsabbagh\Code\Julia\Tests and Notes\12- Create exe file\NetlistEMTP\Net\packagecompiler.jl:25
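One detail worth checking (an observation from the error message, not a confirmed diagnosis): the destination path PackageCompiler fails to open is extremely long, and classic Windows APIs limit paths to 260 characters (MAX_PATH) unless long-path support is enabled. A quick check on the failing path from the error:

```python
# The destination path reported in the ENOENT error, reconstructed verbatim.
dest = (
    r"c:\Users\amroa\OneDrive - polymtl.ca\AmroAlsabbagh\Code\Julia"
    r"\Tests and Notes\12- Create exe file\NetlistEMTP\Net\generated"
    r"\share\julia\artifacts\f42f9d226c70f0bc88e5f897d914d9de1ac2ce03"
    r"\share\wayland-protocols\unstable\fullscreen-shell"
    r"\fullscreen-shell-unstable-v1.xml"
)

# Classic Windows MAX_PATH limit.
MAX_PATH = 260
print(len(dest), len(dest) > MAX_PATH)  # prints: 269 True
```

If the long path is the culprit, the usual workarounds are shortening the project directory (e.g. moving it out of the deeply nested OneDrive folder) or enabling Windows long-path support. This is a hypothesis based on the visible path, not a confirmed fix.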

Related

Why does bazel's rules_foreign_cc make not find/create the artifacts?

I want to create a make rule from rules_foreign_cc.
But even the minimal example below is causing issues for me.
With the following setup:
.
├── BUILD (empty)
├── hello
│   ├── BUILD.bazel
│   ├── hello.c
│   ├── Makefile
│   └── WORKSPACE (empty)
└── WORKSPACE
WORKSPACE:
workspace(name = "test")
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")
http_archive(
name = "rules_foreign_cc",
sha256 = "2a4d07cd64b0719b39a7c12218a3e507672b82a97b98c6a89d38565894cf7c51",
strip_prefix = "rules_foreign_cc-0.9.0",
url = "https://github.com/bazelbuild/rules_foreign_cc/archive/refs/tags/0.9.0.tar.gz",
)
load("@rules_foreign_cc//foreign_cc:repositories.bzl", "rules_foreign_cc_dependencies")
# This sets up some common toolchains for building targets. For more details, please see
# https://bazelbuild.github.io/rules_foreign_cc/0.9.0/flatten.html#rules_foreign_cc_dependencies
rules_foreign_cc_dependencies()
local_repository(
name = "hello",
path = "hello",
)
hello/BUILD.bazel:
load("@rules_foreign_cc//foreign_cc:defs.bzl", "make")
filegroup(
name = "hellosrc",
srcs = glob([
"**",
]),
)
make(
name="hello_build",
lib_source=":hellosrc",
out_bin_dir="",
out_binaries=["hello_binary"],
targets=["all"],
)
hello/Makefile:
all:
	gcc hello.c -o hello_binary
clean:
	rm hello_binary
hello/hello.c:
#include <stdio.h>
int main () {
printf("hello\n");
return 0;
}
and running
bazel build @hello//:hello_build
I'm getting
INFO: Analyzed target @hello//:hello_build (43 packages loaded, 812 targets configured).
INFO: Found 1 target...
ERROR: /home/timotheus/.cache/bazel/_bazel_timotheus/a791a0a19ff4a5d2730aa0c8954985c4/external/hello/BUILD.bazel:10:5: output 'external/hello/hello_build/hello_binary' was not created
ERROR: /home/timotheus/.cache/bazel/_bazel_timotheus/a791a0a19ff4a5d2730aa0c8954985c4/external/hello/BUILD.bazel:10:5: Foreign Cc - Make: Building hello_build failed: not all outputs were created or valid
Target @hello//:hello_build failed to build
Use --verbose_failures to see the command lines of failed build steps.
INFO: Elapsed time: 12.426s, Critical Path: 12.22s
INFO: 7 processes: 5 internal, 2 linux-sandbox.
FAILED: Build did NOT complete successfully
Basically, I have no idea where Bazel is looking for the created binaries (are they even created?). I tried to set out_bin_dir to different values or not set it at all, all with the same effect.
I expect Bazel to generate the binary and find it - or at least give me a hint what it does.
And I think I found the solution.
The make rule of rules_foreign_cc expects the make project to have an install target.
If that's not the case, it doesn't find the binaries.
This is how I fixed my minimal example: by adding an install target to the Makefile
install:
cp -rpv hello_binary $(PREFIX)
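Putting the fix together, the complete minimal Makefile would then look like this (a sketch; note that make recipe lines must be indented with hard tabs, and PREFIX here is the install prefix the install step is invoked with):

```makefile
all:
	gcc hello.c -o hello_binary

# rules_foreign_cc collects the build outputs via this target.
install:
	cp -rpv hello_binary $(PREFIX)
```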

Overwriting hydra configuration groups from CLI

I am trying to override a group of parameters from the CLI and I am not sure how to do it. The structure of my conf is the following
conf
├── config.yaml
├── optimizer
│ ├── adamw.yaml
│ ├── adam.yaml
│ ├── default.yaml
│ └── sgd.yaml
├── task
│ ├── default.yaml
│ └── nlp
│ ├── default_seq2seq.yaml
│ ├── summarization.yaml
│ └── text_classification.yaml
My task/default looks like this
# @package task
defaults:
- _self_
- /optimizer/adam@cfg.optimizer
_target_: src.core.task.Task
_recursive_: false
cfg:
prefix_sep: ${training.prefix_sep}
while optimizer/default.yaml looks like this
_target_: null
lr: ${training.lr}
weight_decay: 0.001
no_decay:
- bias
- LayerNorm.weight
and one specific optimiser, say adam.yaml, looks like this
defaults:
- default
_target_: torch.optim.Adam
In the end the config I'd like to be computed is like this
task:
_target_: src.task.nlp.nli_generation.task.NLIGenerationTask
_recursive_: false
cfg:
prefix_sep: ${training.prefix_sep}
optimizer:
_target_: torch.optim.Adam
lr: ${training.lr}
weight_decay: 0.001
no_decay:
- bias
- LayerNorm.weight
I would like to be able to change the optimizer via the CLI (say, use sgd), but I am not sure how to achieve this. I tried the following, though I understand why it fails:
python train.py task.cfg.optimizer=sgd # fails
python train.py task.cfg.optimizer=/optimizer/sgd #fails
Any tips on how to achieve this?
Github discussion here.
You can't override default list entries in this form.
See this.
In particular:
CONFIG : A config to use when creating the output config. e.g. db/mysql, db/mysql@backup.
GROUP_DEFAULT : An overridable config. e.g. db: mysql, db@backup: mysql.
To be able to override a default list entry, you need to define it as a GROUP_DEFAULT.
In your case, it might look like
defaults:
- _self_
- /optimizer@cfg.optimizer: adam
You should then be able to switch the optimizer from the command line with something like python train.py optimizer@task.cfg.optimizer=sgd.

Using multiple configs in the same group to interpolate values in a yaml file

In Hydra I have the following configuration:
├── conf
│ ├── config.yaml
│ ├── callbacks
│ │ ├── callback_01.yaml
│ │ └── callback_02.yaml
│ └── trainer
│ ├── default.yaml
The callbacks have a structure like this:
_target_: callback_to_instantiate
I need to pass both callbacks to trainer/default.yaml through interpolation.
I tried like this:
_target_: pytorch_lightning.Trainer
callbacks:
- ${callbacks.callback_01}
- ${callbacks.callback_02}
With the config.yaml like this:
defaults:
- _self_
- trainer: default
I also did other trials, but it doesn't seem to work. Is there a way to interpolate like that in a yaml file by using two or more yaml files that are in the config group?
I would like if possible to keep this structure.
Currently the recommended approach is:
compose a mapping whose values are the desired callbacks, and then
use the oc.dict.values OmegaConf resolver to get a list of values from that dictionary.
# conf/config.yaml
defaults:
- callbacks@_callback_dict.cb1: callback_01
- callbacks@_callback_dict.cb2: callback_02
- trainer: default
- _self_
# conf/trainer/default.yaml
_target_: pytorch_lightning.Trainer
callbacks: ${oc.dict.values:_callback_dict}
# my_app.py
from typing import Any
import hydra
from omegaconf import DictConfig, OmegaConf
@hydra.main(config_path="conf", config_name="config")
def app(cfg: DictConfig) -> Any:
OmegaConf.resolve(cfg)
del cfg._callback_dict
print(OmegaConf.to_yaml(cfg))
if __name__ == "__main__":
app()
At the command line:
$ python my_app.py
trainer:
_target_: pytorch_lightning.Trainer
callbacks:
- _target_: callback_to_instantiate_01
- _target_: callback_to_instantiate_02
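What the oc.dict.values resolver does is easy to picture in plain Python terms (a rough stdlib analogy, not OmegaConf's actual implementation):

```python
# The composed _callback_dict node, written out as a plain dict.
callback_dict = {
    "cb1": {"_target_": "callback_to_instantiate_01"},
    "cb2": {"_target_": "callback_to_instantiate_02"},
}

# ${oc.dict.values:_callback_dict} resolves to the list of the mapping's values.
callbacks = list(callback_dict.values())
print(callbacks)
```

This is why the defaults list composes each callback under a distinct key (cb1, cb2): the keys only serve to build the mapping, and the resolver turns its values into the list the Trainer expects.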
For reference, there is an open issue on Hydra's github repo advocating for an improved user experience around this use case.

Interpolation using the selected config group option in Hydra

I am using hydra composition with the following structure:
├── configs
│   ├── config.yaml
│   ├── data
│   │   ├── dataset_01.yaml
│   │   └── dataset_02.yaml
│   └── model
│   ├── bert.yaml
│   └── gpt.yaml
config.yaml
defaults:
- model: bert
- data: dataset_01
...
data/dataset_01.yaml
# @package _group_
name: "dataset_01"
train:
path: "../resources/datasets/dataset_01/train.jsonl"
num_samples: 1257391
test:
path: "../resources/datasets/dataset_01/test.jsonl"
num_samples: 71892
val:
path: "../resources/datasets/dataset_01/val.jsonl"
num_samples: 73805
model/bert.yaml
# @package _group_
name: "bert"
encoder: "source.encoder.BertEncoder.BertEncoder"
encoder_hparams:
architecture: "bert-base-uncased"
lr: 1e-7
tokenizer:
architecture: "bert-base-uncased"
predictions:
path: "../resources/predictions/bert_predictions.pt"
entry point
@hydra.main(config_path="configs/", config_name="config.yaml")
def perform_tasks(hparams):
model = MyModel(hparams.model)
if __name__ == '__main__':
perform_tasks()
In the context of hparams.model, there is no way for OmegaConf to interpolate the key data.name since it is not in scope.
So it would be great if there were an approach that causes the interpolation at the beginning of the application.
OmegaConf interpolation is absolute and operates on the final config.
Try this:
With Hydra 1.1 or newer you can use hydra.runtime.choices, a dictionary containing the config group options you have selected.
You will be able to interpolate without adding the name field, using ${hydra:runtime.choices.GROUP_NAME}:
predictions:
path: "dir/bert_${hydra:runtime.choices.GROUP_NAME}_pred.pt"
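For the layout in the question, where the group is named model, a concrete sketch of model/bert.yaml might look like this (assuming the same predictions path scheme as in the question):

```yaml
# model/bert.yaml (sketch)
predictions:
  path: "../resources/predictions/${hydra:runtime.choices.model}_predictions.pt"
```

With model: bert selected in the defaults list, the interpolation resolves to ../resources/predictions/bert_predictions.pt, without a hardcoded name field.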

When do you need aggregate over dependsOn

When doing a multiproject build, you can list the projects you depend upon in dependsOn, and tasks will run on the dependencies first, so you can depend on their results.
There is also an aggregate task that, well, aggregates subprojects. How do aggregating and depending on subprojects differ, and in what cases should you use aggregate instead of dependsOn?
The key difference is that aggregate does not modify the classpath and does not establish ordering between sub-projects. Consider the following multi-project build consisting of root, core and util projects:
├── build.sbt
├── core
│   ├── src
│   └── target
├── project
│   ├── build.properties
│   └── target
├── src
│   ├── main
│   └── test
├── target
│   ├── scala-2.13
│   └── streams
└── util
├── src
└── target
where
core/src/main/scala/example/Core.scala:
package example
object Core {
def foo = "Core.foo"
}
util/src/main/scala/example/Util.scala
package example
object Util {
def foo =
"Util.foo" + Core.foo // note how we depend on source from another project here
}
src/main/scala/example/Hello.scala:
package example
object Hello extends App {
println(42)
}
Note how Util.foo has a classpath dependency on Core.foo from core project. If we now try to establish "dependency" using aggregate like so
lazy val root = (project in file(".")).aggregate(core, util)
lazy val util = (project in file("util"))
lazy val core = (project in file("core"))
and then execute compile from root project
root/compile
it will indeed attempt to compile all the aggregated projects; however, Util will fail to compile because it is missing a classpath dependency:
sbt:aggregate-vs-dependsOn> root/compile
[info] Compiling 1 Scala source to /Users/mario/IdeaProjects/aggregate-vs-dependson/target/scala-2.13/classes ...
[info] Compiling 1 Scala source to /Users/mario/IdeaProjects/aggregate-vs-dependson/core/target/scala-2.13/classes ...
[info] Compiling 1 Scala source to /Users/mario/IdeaProjects/aggregate-vs-dependson/util/target/scala-2.13/classes ...
[error] /Users/mario/IdeaProjects/aggregate-vs-dependson/util/src/main/scala/example/Util.scala:5:18: not found: value Core
[error] "Util.foo" + Core.foo // note how we depend on source from another project here
[error] ^
[error] one error found
[error] (util / Compile / compileIncremental) Compilation failed
[error] Total time: 1 s, completed 13-Oct-2019 12:35:51
Another way of seeing this is to execute show util/dependencyClasspath, whose output will be missing the Core dependency.
On the other hand, dependsOn will modify classpath and establish appropriate ordering between projects
lazy val root = (project in file(".")).aggregate(core, util)
lazy val util = (project in file("util")).dependsOn(core)
lazy val core = (project in file("core"))
Now root/compile gives
sbt:aggregate-vs-dependsOn> root/compile
[info] Compiling 1 Scala source to /Users/mario/IdeaProjects/aggregate-vs-dependson/target/scala-2.13/classes ...
[info] Compiling 1 Scala source to /Users/mario/IdeaProjects/aggregate-vs-dependson/core/target/scala-2.13/classes ...
[info] Compiling 1 Scala source to /Users/mario/IdeaProjects/aggregate-vs-dependson/util/target/scala-2.13/classes ...
[success] Total time: 1 s, completed 13-Oct-2019 12:40:25
and show util/dependencyClasspath shows Core on the classpath:
sbt:aggregate-vs-dependsOn> show util/dependencyClasspath
[info] * Attributed(/Users/mario/IdeaProjects/aggregate-vs-dependson/core/target/scala-2.13/classes)
Finally, aggregate and dependsOn are not mutually exclusive; in fact, it is common practice to use both at the same time: often aggregating all sub-projects on the root to aid building, whilst using dependsOn to handcraft particular orderings for particular sub-projects.