Can Premake package a project?

I have a Premake 5 project that builds a static library (primarily via the gmake2 or vs2017 actions). I'd like to create a build target for the release material itself: the library zipped up with some text and header files.
How do I do this with Premake? I tried something like:
project "thing_export"
kind "Utility"
os.mkdir "Release"
-- etc
...but of course the os.* functions are executed when Premake itself runs; they don't end up in the generated build files.
I also tried using custom build commands:
project "thing_export"
kind "Utility"
buildmessage "Creating release"
buildcommands { "{copy} README.md '%{prj.location}'" }
buildoutputs { "'%{prj.location}/README.md'" }
The makefile generated by the gmake or gmake2 actions is the same as for an empty project. No actions are generated for it.
How can I use Premake to create my releases?

Depending on what you need to do, you might be able to make that approach work. What I usually do (and what Premake itself does) is write a Lua script to automate the release packaging, and use a custom Premake action to run it.
Have a look at Premake's package.lua script to see how it automates the packaging. And here is the custom action that calls package.lua:
newaction {
    trigger = "package",
    description = "Creates source and binary packages",
    execute = function ()
        include (path.join(corePath, "scripts/package.lua"))
    end
}
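To give a feel for what such a script can contain, here is a sketch only — not Premake's actual package.lua, and the file names are placeholders. The os.* functions that don't work inside a project block are exactly the right tool here, because the action's execute function runs at Premake time:
-- scripts/package.lua (illustrative sketch only)
-- Collect the release material into a staging directory...
os.mkdir("Release")
os.copyfile("README.md", "Release/README.md")
os.copyfile("include/thing.h", "Release/thing.h")
os.copyfile("bin/Release/libthing.a", "Release/libthing.a")

-- ...then shell out to archive it (assumes a zip tool on the PATH).
os.execute("zip -r thing-release.zip Release")
Running premake5 package then performs the packaging directly, with no generated build files involved.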


Global deno.json configuration file?

I want to apply this deno.json configuration file to all my deno projects:
{
  "fmt": {
    "options": {
      "indentWidth": 4
    }
  }
}
Is there a way to globally apply this configuration so I don't have to have this deno.json file in every project?
I'm using VSCode, Ubuntu and Deno 1.28.1.
Because of the way that the Deno VS Code extension overrides/suppresses the built-in TS language server, it is not advised to enable the extension globally: this would cause problems in every non-Deno TypeScript project.
That said, you can create a single deno.json(c) file at a high-level location in your filesystem, for example in your home directory. To use a concrete example location on Linux: /home/your_username/deno.json.
Then, when configuring a new VS Code project, you only need to configure the location of the config file in .vscode/settings.json in order for the extension to use it:
{
  "deno.enable": true,
  "deno.config": "/home/your_username/deno.json"
}
When using Deno in the CLI, it will automatically walk your filesystem and find the nearest parent config file. From the manual:
Since v1.18, Deno will automatically detect deno.json or deno.jsonc configuration file if it's in your current working directory (or parent directories).
Regardless of the above, this strategy is not advised: a better approach might be simply to create a personal CLI script/function which generates a new Deno config and VS Code config from a template that you create. This way, each of your projects maintains its own configuration data (a good thing), and you also don't have to configure each new one manually, because you did the work once when creating the template-generation script (win-win).
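As a sketch of that idea (the file name, templates, and values below are placeholders to adapt, not an official tool), a small Deno script can stamp out both config files for a new project:
// init-deno-project.ts -- hypothetical template generator, adjust to taste
const denoJson = {
  fmt: { options: { indentWidth: 4 } },
};

const vscodeSettings = {
  "deno.enable": true,
  "deno.config": "./deno.json",
};

// Write deno.json and .vscode/settings.json into the current directory.
await Deno.writeTextFile("deno.json", JSON.stringify(denoJson, null, 2));
await Deno.mkdir(".vscode", { recursive: true });
await Deno.writeTextFile(
  ".vscode/settings.json",
  JSON.stringify(vscodeSettings, null, 2),
);
Run it once in each new project directory with deno run --allow-write init-deno-project.ts.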

How can I use the modules written for Frama-C's plugins?

Build is a module which has been developed in order to build the PDG (program dependence graph).
I wrote a script which uses this module Build, but when I try to launch the script with:
frama-c -load-script test.ml
I get the error: Unbound module Build.
Is there a way to get access to this module? I need it in my project.
Build is one example, but there are other modules, like Sets, which provides functions to read a PDG. However, other modules like PdgTypes don't cause errors. If anybody could help me...
In my file test.ml,
let compute = Build.compute_pdg
....
let () = Db.Main.extend main
You can't do that. -load-script can only work for scripts that do not have any dependencies outside of Frama-C (or Frama-C's own dependencies, such as OCamlgraph). As suggested by Anne, if your code is contained in more than one file, you should write it as a plugin.
For a simple plugin, you basically only have to write a short Makefile in addition to your OCaml code. This Makefile will mainly contain the list of source files of your plugin and a few additional pieces of information (such as the plugin's name), as explained in the developer's manual, which contains a small tutorial.
Alternatively, if you have only two files, it should be possible to assemble them manually into a single module that can then be loaded by Frama-C. Supposing you have build.ml and test.ml, you could do (with the Sodium version):
ocamlopt -I $(frama-c-config -print-libpath) -c build.ml
ocamlopt -I $(frama-c-config -print-libpath) -c test.ml
ocamlopt -shared -o script.cmxs build.cmx test.cmx
frama-c -load-module script.cmxs [other options] [files]
The modules you refer to, Build and Sets, are not considered as being part of the public user interface of Frama-C. Instead, they are internal to the plugin PDG. The modules of PDG you can access from user scripts are those in the directory src/pdgTypes: PdgIndex, PdgMarks and PdgTypes. Then, a second part of the API is available inside Db.Pdg (Db is in src/kernel/db.ml). In particular, most of the functions of the module Sets are re-exported there.
As for the functions available inside Build, they have been deemed too low-level to be exported. If you really need to access them, you will have to copy the directory src/pdg and transform it into a plugin (with a new name, to avoid clashes).

How to define directory structure following packages in project's Scala build definitions?

There are two full build definition files in an sbt project: Build.scala and Helpers.scala. They are located in the project folder.
I'd like to put the Helpers module into a separate sub-folder, project/utils. When I do import utils.Helpers in Build.scala, it says:
not found: object utils
Is it possible to define directory structure that follows the packages in sbt full build definitions?
You should use project/src/main/scala/utils instead of project/utils.
sbt builds are recursive, which means that the sbt build definition is built by sbt itself, applying the same rules as for a normal project.
Unlike Java, Scala has no strict relation between the package and the folder structure, meaning you can place your sources wherever you like; they don't have to match the package declaration. Scala will not complain.
sbt knows where to search for sources by checking the sourceDirectories setting key.
You can check it easily by executing show sourceDirectories. However, this shows the sourceDirectories for your actual project. How can you check it for the build? Quite easily: execute reload plugins, which takes you into the build project. Execute show sourceDirectories, and it should show you that it looks for sources in project/src/main/scala, project/src/main/java, and one more directory for managed sources (which doesn't matter for our case). Now you can execute reload return to go back to your main project.
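The round trip looks roughly like this (the paths are illustrative and will differ on your machine):
> reload plugins
> show sourceDirectories
[info] * /home/you/myproject/project/src/main/scala
[info] * /home/you/myproject/project/src/main/java
[info] * /home/you/myproject/project/target/scala-2.10/src_managed/main
> reload return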
Given that, you should be able to create an object, let's say named Helpers, in project/src/main/scala/utils/Helpers.scala:
package utils

object Helpers {
  def printFancy(name: String) = println(s">>$name<<")
}
And use it in your Build.scala:
import sbt._
import Keys._
import utils.Helpers._

object MyBuild extends Build {
  val printProjectName = taskKey[Unit]("Prints fancy project name")

  lazy val root = project.in(file(".")).settings(
    printProjectName := printFancy(name.value)
  )
}
You can test it by executing printProjectName.
> printProjectName
>>root<<
[success] Total time: 1 s, completed May 29, 2014 1:24:16 AM
I've stated earlier that sbt is recursive. This means that, if you want, you can use the same technique to configure the sbt build as you use for configuring the build of your own project.
If you don't want to keep your files under project/src/main/scala, but just under project/utils, you can do so by creating a build.sbt in your project folder, with the following content:
unmanagedSourceDirectories in Compile += baseDirectory.value / "utils"
Just as described in the documentation.
Now, even if you place your utils in project/utils, sbt should be able to find them.
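So, under this setup, the build's layout would be roughly:
project/
  build.sbt        (the one-line setting above)
  Build.scala
  utils/
    Helpers.scala  (declares package utils)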

What are key differences between sbt-pack and sbt-assembly?

I've just stumbled upon the sbt-pack plugin. Its development stream seems steady. This surprised me, as I believed that the only plugin for (quoting sbt-pack's headline) "creating distributable Scala packages" was sbt-assembly (among its other features).
What are the key differences between the plugins? When should I use one over the other?
(Disclaimer: I maintain sbt-assembly)
sbt-assembly
sbt-assembly creates a fat JAR: a single JAR file containing all the class files from your code and from your libraries. Over its evolution, it has also grown ways of resolving conflicts when multiple JARs provide the same file path (like a config or README file). It involves unzipping all the library JARs, so it's a bit slow, but the results are heavily cached.
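For instance, conflict resolution is driven by a merge strategy setting. A minimal sketch, using the pre-1.x assemblyMergeStrategy in assembly syntax from the plugin's README (adjust to your plugin version):
// build.sbt: decide what to do when several JARs provide the same path
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard // drop duplicate metadata
  case "README.md"                   => MergeStrategy.first   // keep the first one found
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)                                             // fall back to the defaults
}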
sbt-pack
sbt-pack keeps all the library JARs intact, moves them into the target/pack directory (as opposed to the ivy cache, where they would normally live), and generates a shell script for you to run them.
sbt-native-packager
sbt-native-packager is similar to sbt-pack, but it was started by sbt committer Josh Suereth and is now maintained by the highly capable Nepomuk Seiler (also known as muuki88). The plugin supports a number of formats, like Windows msi files and Debian deb files. A recent addition is support for Docker images.
All are viable means of creating deployment images. In certain cases, like deploying your application to a web framework, it might make things easier if you're dealing with one file as opposed to a dozen.
Honorable mentions: sbt-proguard and sbt-onejar.
Although Eugene Yokota's explanation is complete, I would like to compare the mentioned plugins with the built-in package command, focusing on how they are used and on the different results they generate.
Directory settings and build.sbt
lazy val commonSettings = Seq(
  organization := "stackOverFlow",
  scalaVersion := "2.11.12",
  version := "1.0"
)

lazy val app = (project in file("app")).
  enablePlugins(PackPlugin).
  settings(commonSettings)
The above build.sbt file declares a project called app and includes all the source files in the app directory. To enable the pack plugin, enablePlugins(PackPlugin) should be included in the sbt file.
Also, I've put the lines below in the project/plugins.sbt file to use the pack and assembly plugins in our project:
addSbtPlugin("org.xerial.sbt" % "sbt-pack" % "0.9.3")
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")
The package command is already integrated into sbt by default, so you don't have to specify it explicitly using addSbtPlugin. However, the sbt-pack and sbt-assembly plugins are not included in sbt by default, so you have to specify that you want to use them. addSbtPlugin is a way to tell sbt, "I want to use the xxx and yyy plugins in my project".
Also, I implemented two contrived Scala files in ./app/src/main/scala:
AppBar.scala
class AppBar {
  def printDescription() = println(AppBar.getDescription)
}

object AppBar {
  private val getDescription: String = "Hello World, I am AppBar"

  def main(args: Array[String]): Unit = {
    val appBar = new AppBar
    appBar.printDescription()
  }
}
AppFoo.scala
class AppFoo {
  def printDescription() = println(AppFoo.getDescription)
}

object AppFoo {
  private val getDescription: String = "Hello World, I am AppFoo"

  def main(args: Array[String]): Unit = {
    val appFoo = new AppFoo
    appFoo.printDescription()
  }
}
sbt package
This is a very basic command, included in sbt, to help you distribute your project through a jar file. The jar file generated by the package command is located at projectDirectory/target/scala-2.11/app_2.11-1.0.jar (here, the scalaVersion and version setting keys in the build.sbt file are used to generate the jar file name).
When you look inside the jar, you can see the class files generated by the sbt tool, which are the result of compiling the sources in app/src/main/scala. Also, it includes a MANIFEST file.
$ vi projectDirectory/target/scala-2.11/app_2.11-1.0.jar
META-INF/MANIFEST.MF
AppBar$.class
AppBar.class
AppFoo.class
AppFoo$.class
Note that it only includes the class files generated from the Scala files located in the app/src/main/scala directory. The jar file generated by the package command does not include any Scala-related libraries, such as collection in the Scala library (e.g., collection.mutable.Map.class). Therefore, to execute the program you need the Scala library on the classpath, because the generated jar file only contains the minimal classes compiled from the Scala sources that I implemented. That is the reason why the jar file contains AppBar.class, AppBar$.class for the companion object, and so on.
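Concretely, that means supplying scala-library yourself when running the package output; something like the following (the scala-library path is machine-specific):
$ java -cp "app_2.11-1.0.jar:/path/to/scala-library-2.11.12.jar" AppBar
Hello World, I am AppBar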
sbt-assembly
As mentioned by Eugene Yokota, sbt-assembly also helps you distribute your project by generating a jar file; however, the generated jar file includes not only the class files generated from your source code, but also all the libraries that you need to execute the program. For example, to execute the main function defined in the AppFoo object, you need the Scala library. The same goes for any external libraries you add to your project, which can be included by adding the dependencies to the libraryDependencies key:
libraryDependencies ++= Seq("org.json4s" %% "json4s-jackson" % "3.5.3")
For example, you can include the json4s libraries in your project, and the jar files supporting json4s will also be added to the final jar file generated by sbt-assembly. In other words, when you invoke assembly in your sbt shell, it generates one jar file containing everything required to execute your program, so that you don't need any other dependency to run it.
When you run the assembly command in your sbt shell, it will generate one jar file in your target directory. In this case, you will find app-assembly-1.0.jar in the app/target/scala-2.11 directory. When you look inside the jar file, you can find that it contains lots of classes:
$ vi app/target/scala-2.11/app-assembly-1.0.jar
META-INF/MANIFEST.MF
scala/
scala/annotation/
scala/annotation/meta/
scala/annotation/unchecked/
scala/beans/
scala/collection/
scala/collection/concurrent/
scala/collection/convert/
scala/collection/generic/
scala/collection/immutable/
scala/collection/mutable/
scala/collection/parallel/
scala/collection/parallel/immutable/
scala/collection/parallel/mutable/
scala/collection/script/
scala/compat/
scala/concurrent/
scala/concurrent/duration/
scala/concurrent/forkjoin/
scala/concurrent/impl/
scala/concurrent/util/
scala/io/
scala/math/
scala/ref/
scala/reflect/
scala/reflect/macros/
scala/reflect/macros/internal/
scala/runtime/
scala/sys/
scala/sys/process/
scala/text/
scala/util/
scala/util/control/
scala/util/hashing/
scala/util/matching/
AppBar$.class
AppBar.class
AppFoo$.class
AppFoo.class
......
As mentioned before, because the jar file generated by assembly contains all the dependencies needed to execute your program, such as the Scala and external libraries, you may expect to be able to invoke the main functions defined in the AppFoo and AppBar objects.
jaehyuk@ubuntu:~/work/sbt/app/target/scala-2.11$ java -cp './*' AppFoo
Hello World, I am AppFoo
jaehyuk@ubuntu:~/work/sbt/app/target/scala-2.11$ java -cp './*' AppBar
Hello World, I am AppBar
Yeah~ you can execute the main function using the generated jar file.
sbt-pack
sbt-pack is almost the same as sbt-assembly; it collects all the libraries on which your project depends as the jar files required to execute your program. However, sbt-pack doesn't integrate all the dependencies into one jar file; instead, it generates multiple jar files, one per library dependency, plus one for your own classes (e.g., AppFoo.class).
Also, interestingly, it automatically generates scripts for invoking all the main functions defined in your Scala source files, and a Makefile to install the program. Let's take a look at the pack directory created after you run the pack command in your sbt shell.
jaehyuk@ubuntu:~/work/sbt/app/target/pack$ ls
bin lib Makefile VERSION
jaehyuk@ubuntu:~/work/sbt/app/target/pack$ ls bin/
app-bar app-bar.bat app-foo app-foo.bat
jaehyuk@ubuntu:~/work/sbt/app/target/pack$ ls lib/
app_2.11-1.0.jar sbt_2.12-0.1.0-SNAPSHOT.jar scala-library-2.11.12.jar
jaehyuk@ubuntu:~/work/sbt/app/target/pack$
As shown above, two directories and two files are created: bin contains all the script files to execute the functions defined in your sources (each file is a script that helps you execute a main method defined in your Scala files); lib contains all the jar files required to execute your program; and lastly, the Makefile can be used to install your program and its dependent libraries on your system.
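As a quick sanity check, assuming the launchers behave as described, invoking one of the generated scripts directly runs the corresponding main method:
jaehyuk@ubuntu:~/work/sbt/app/target/pack$ ./bin/app-foo
Hello World, I am AppFoo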
For the details, please refer to the GitHub pages for each plugin.

source_dirs doesn't work in .gpr script

I've inherited an Ada/C++ project and I'm trying to use gprbuild to automate the build process (which was previously done with a set of about 12 .bat files). I'm totally new to Ada and gprbuild, but have actually made pretty good progress. I can compile the .exe's that I need, but not the library. I am not at liberty to completely share the .gpr file, but the relevant parts look like this:
[snip]
for Source_Dirs use (
    "c_plus_plus_files",
    "ada_files",
    "..\another_project\some_other_ada_files",
    "..\another_project\even_more_ada_files"
);
[snip]
for Source_Files use (
    "my_ada_file.ads",
    "another_ada_file.ads",
    "one_more_ada_file.adb",
    "c_plus_plus_file.cpp"
);
[snip]
When I run "gprbuild -P my_project.gpr", it in turn runs "gcc -c -gnat05 one_more_ada_file.adb" and complains that it cannot find a certain file that one_more_ada_file.adb depends on. The dependency is in ..\another_project\even_more_ada_files, so I would expect it to be found. But if I copy the dependency into the same folder as one_more_ada_file.adb, the error goes away.
Because of how the VCS is setup and how we're sharing code between two projects, I'd much rather figure out what's wrong with how I'm using "for source_dirs use" than to keep multiple copies of all the ada files.
Again, I'm an Ada/GPS newb, so if I'm leaving out relevant information, please let me know.
Update: It appears that the specific problem isn't that Source_Dirs isn't doing anything at all, but that it doesn't handle having two source dirs where .ads files in one dir depend on .ads files in the other. That is, even within my "other" project above, an .ads file in some_other_ada_files that depends on an .ads file in even_more_ada_files doesn't get compiled with the gcc -c -gnat05 command when I run gprbuild (error: the file in even_more_ada_files is not found), but it does get compiled if I run the gcc command by hand (or in a .bat script) with two -I flags, one for each directory.
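For reference, the hand-run command that succeeds has this shape (directory names as above):
gcc -c -gnat05 -Isome_other_ada_files -Ieven_more_ada_files one_more_ada_file.adb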
When dealing with multiple projects, you should normally create a .gpr-file for each project, and let your projects depend on the other projects as needed.
Thus:
project another_project is
    for Source_Dirs use
        ("some_other_ada_files",
         "even_more_ada_files");
end another_project;
and then:
with "..\another_project\another_project.gpr"
project The_Project is
for Source_Dirs use
("c_plus_plus_files",
"ada_files");
end The_Project;
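With those two project files in place, the dependency is resolved through the with clause, so a single invocation from the main project's directory should find the units in both source trees:
gprbuild -P the_project.gpr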
