I defined some classes in sbt's project directory without a package (i.e. all my files were directly under project and did not include a package statement). It worked fine.
Now when I tried to group them into packages and ran sbt reload, I got not found: value XXX at the line where I import the package in my build.sbt (XXX is the name of the package).
Can't the project directory deal with packages?
EDIT after comment
It will work if you put your source files under project/src/main/scala.
Check this structure:
tree
.
├── build.sbt
└── project
    ├── build.properties
    └── src
        └── main
            └── scala
                └── foo
                    └── Bar.scala

5 directories, 3 files
build.sbt
import foo._
version := Bar.ver
and Bar.scala
package foo

object Bar {
  val ver = "1.0.0"
}
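The same mechanism works for any other value the build definition needs. A minimal sketch, assuming you extend Bar with a list of compiler flags (commonScalacOptions is just an illustrative name), would be:
package foo

object Bar {
  val ver = "1.0.0"
  // any other constants the build needs can live here as well
  val commonScalacOptions = Seq("-deprecation", "-feature")
}
and then in build.sbt:
import foo._

version := Bar.ver
scalacOptions ++= Bar.commonScalacOptions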
I recently needed to integrate some projects as submodules, and each of them has its own build.sbt.
There are also dependencies between the submodules.
Therefore, I need to dynamically convert the libraryDependencies to the dependsOn format.
The project structure looks like this: root depends on A and B, while A also depends on B.
.
├── build.sbt
├── projA
│   ├── build.sbt
│   └── src
├── projB
│   ├── build.sbt
│   └── src
└── src
    └── main
It seems that appending .settings/.dependsOn directly to a project that already has its own build.sbt is ineffective:
// ./build.sbt
lazy val root = (project in file("."))
  .settings(commonSettings)
  .dependsOn(A, B)

lazy val A = (project in file("projA"))
  .settings(commonSettings) // <-- does not work
  .dependsOn(B)             // <-- does not work

lazy val B = (project in file("projB"))
  .settings(commonSettings) // <-- does not work
So I tried this method to dynamically edit libraryDependencies, but that approach can only modify settings.
The inter-project dependencies do not seem to be controlled by settings.
Is there any way to dynamically add dependsOn to a project?
Or how can I make a dependsOn appended to a subproject in the root build.sbt work?
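For reference, dependsOn takes a vararg of ClasspathDep[ProjectReference], so in the root build.sbt the dependency list can at least be written as an ordinary Scala value (rootDeps below is just an illustrative name), although it still has to be known while the build is loading:
// the list is a plain value, so it can be computed when the build is loaded
lazy val rootDeps: Seq[ClasspathDep[ProjectReference]] = Seq(A, B)

lazy val root = (project in file("."))
  .settings(commonSettings)
  .dependsOn(rootDeps: _*)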
Thanks.
I am currently building a backend using FastAPI and I am facing some issues running it with poetry scripts. This is my project structure:
backend
├── src
│   └── asgi.py
├── Dockerfile
├── poetry.lock
└── pyproject.toml
pyproject.toml
[tool.poetry]
name = "backend"
version = "0.1.0"
description = ""
authors = ["Pierre-Alexandre35 <46579114+pamousset75@users.noreply.github.com>"]
readme = "README.md"
[tool.poetry.dependencies]
python = "^3.9"
uvicorn = "^0.17.6"
fastapi = "^0.78.0"
psycopg2 = "^2.9.3"
jwt = "^1.3.1"
python-multipart = "^0.0.5"
[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
[tool.poetry.scripts]
foo='asgi:__main__'
If I run poetry run python asgi.py, it works perfectly, but if I use the poetry script foo, I get No file/folder found for package backend. These are all the combinations I tried, and I get the same error for every poetry run foo:
foo='asgi:main'
foo='backend.asgi:__main__'
foo='backend.asgi:main'
foo='backend.asgi:.'
Your project structure does not seem to be correct, assuming backend is the package you are trying to create.
Use this structure:
.
├── pyproject.toml
├── poetry.lock
├── README.md
└── backend
    ├── src
    │   └── asgi.py
    ├── Dockerfile
    └── __init__.py
Also, in scripts use the following (assuming you are trying to run main with foo):
[tool.poetry.scripts]
foo='backend.asgi:__main__'
I'm new to Julia.
I'm seeking the best practice for structuring directories, packages, and projects.
Compared with Python, the most annoying parts of Julia are as follows:
The path for e.g. include seems to depend on the path of the executed file. I'd like to keep a specific reference path so that I can load files easily.
I asked a similar question here, and someone told me that creating packages and loading them with using would make my projects easier to manage.
However, it's really confusing when loading files and modules. For example,
MyProject
├── MyPkg1
│   ├── src
│   │   └── MyPkg1.jl
│   └── test
│       └── runtests.jl
└── MyPkg2
    ├── src
    │   └── MyPkg1.jl
    └── test
        └── runtests.jl

6 directories, 4 files
a) MyPkg1/src/MyPkg1.jl
module MyPkg1

export func_export

function func_export()
    println("hi")
end

end
b) MyPkg1/test/runtests.jl
using MyPkg1
using Test
@testset "MyPkg1.jl" begin
    func_export() # raises error: func_export not defined
end
c) MyPkg2/test/runtests.jl
using MyPkg2
using Test
using MyPkg1 # other Pkg
@testset "MyPkg2.jl" begin
    func_export() # raises error
end
As shown in the code above, errors are raised in b) and c).
So my questions are:
If I did something wrong in the above example, please explain in detail why the errors occur.
What is the best practice for directory structure when developing Julia projects?
I am trying to depend on RcppArmadillo in my package, but I get the error unable to load shared object /tmp/Rtmp0LswYZ/Rinst82cbed4eaee/00LOCK-alt.raster/00new/alt.raster/libs/alt.raster.so: undefined symbol: dsyev_ when I run R CMD build . in my package directory. However, following the instructions on https://stackoverflow.com/a/14165455 in an interactive R session works correctly. I have also run R -e 'Rcpp::compileAttributes()' in my package directory, and it seems to generate RcppExports.cpp correctly. What am I doing wrong?
As surmised in the comments above, it is really beneficial to start from a working example.
To create one, we offer the RcppArmadillo.package.skeleton() function. Use it as follows:
edd@rob:/tmp$ Rscript -e 'RcppArmadillo::RcppArmadillo.package.skeleton("demoPkg")'
Calling kitten to create basic package.
Creating directories ...
Creating DESCRIPTION ...
Creating NAMESPACE ...
Creating Read-and-delete-me ...
Saving functions and data ...
Making help files ...
Done.
Further steps are described in './demoPkg/Read-and-delete-me'.
Adding pkgKitten overrides.
>> added .gitignore file
>> added .Rbuildignore file
Deleted 'Read-and-delete-me'.
Done.
Consider reading the documentation for all the packaging details.
A good start is the 'Writing R Extensions' manual.
And run 'R CMD check'. Run it frequently. And think of those kittens.
Adding RcppArmadillo settings
>> added Imports: Rcpp
>> added LinkingTo: Rcpp, RcppArmadillo
>> added useDynLib and importFrom directives to NAMESPACE
>> added Makevars file with Rcpp settings
>> added Makevars.win file with RcppArmadillo settings
>> added example src file using armadillo classes
>> added example Rd file for using armadillo classes
>> invoked Rcpp::compileAttributes to create wrappers
edd@rob:/tmp$
It should create these files:
edd@rob:/tmp$ tree demoPkg/
demoPkg/
├── DESCRIPTION
├── man
│   ├── demoPkg-package.Rd
│   ├── hello.Rd
│   └── rcpparma_hello_world.Rd
├── NAMESPACE
├── R
│   ├── hello.R
│   └── RcppExports.R
└── src
    ├── Makevars
    ├── Makevars.win
    ├── rcpparma_hello_world.cpp
    └── RcppExports.cpp

3 directories, 11 files
edd@rob:/tmp$
I have a multi-module sbt project:
├── build.sbt
├── bar
│   ├── build.sbt
│   └── ...
├── foo
│   ├── build.sbt
│   └── ...
└── ...
And 2 versions of build.sbt:
lazy val foo = project in(file("./foo"))
lazy val bar = project in(file("./bar"))
And second version:
lazy val foo = project in(file("./foo"))
lazy val bar = project in(file("./bar"))
lazy val root = Project(id = "root", base = file("."))
  .aggregate(foo, bar)
What are the differences between these versions? Are there any advantages to the second version?
With your second version loaded, type projects at the sbt command line and you should see that you have three of them. The one with a star in front is the default project; it will be your root project. Commands that you type will apply to it. As it is an aggregate project, the commands will be applied in turn to both foo and bar.
package is a good example command: typing it at the root will also run package in foo and bar, producing a jar for each of them, if that is what you want.
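Note that aggregate only forwards commands to foo and bar; it does not put their classes on the root project's classpath. If the root project's own code should also compile against them, a sketch of the usual addition (not part of your build above) would be:
lazy val root = Project(id = "root", base = file("."))
  .aggregate(foo, bar)
  .dependsOn(foo, bar)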