Premake external configuration for library

I'm working on changing all the makefiles in my project to use premake.
Since the project includes many external libraries that use other configuration tools such as autotools, I decided to keep their own Makefiles rather than generating makefiles with premake for those libraries. I understand that kind "Makefile" is available for what I am trying to do.
The problem is that the autotools ./configure command has to run first, and I don't know how to do that from premake. Can anybody suggest how to deal with this?
Thanks in advance.

You can use Lua's os.execute() to call external commands from your premake script, together with premake's os.chdir() to switch into the library's directory first:
os.chdir("path/to/project")
os.execute("./configure")
Does that do what you need?
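If the library needs more than one command (configure and then make, say), one option is a small wrapper script that the premake script launches with os.execute(). A minimal sketch, assuming the library lives at external/libfoo (a hypothetical path):

#!/bin/sh
# configure-libfoo.sh: prepare and build the bundled autotools library.
cd external/libfoo || exit 1
# Only re-run configure if it has not produced a Makefile yet.
[ -f Makefile ] || ./configure
make

Then os.execute("sh configure-libfoo.sh") from your premake script runs the whole sequence.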

Related

Add Custom Scripts with Linux (Ubuntu)

I'm really enjoying AzerothCore; however, I would like to add custom scripts to my server. I have found a guide on how to do that, but it is based on Windows.
guide: http://www.ac-web.org/forums/showthread.php?145843-Trinity-How-to-add-a-c-script-to-your-core
Are there guides that are specific to Linux (Ubuntu)? Thanks in advance :)
The procedure for adding a Script to AzerothCore on Linux is the same as on Windows; the only difference is the way you (re)compile the project.
So you can add your Scripts just the way you would do on Windows, then recompile the project.
As the official tutorial explains, you can just (re)compile your project by:
re-running the cmake command, for example:
cmake ../ -DCMAKE_INSTALL_PREFIX=$HOME/azeroth-server/ -DCMAKE_C_COMPILER=/usr/bin/clang -DCMAKE_CXX_COMPILER=/usr/bin/clang++ -DWITH_WARNINGS=1 -DTOOLS=0 -DSCRIPTS=1
re-running make and make install:
make -j 6;
make install
Then you're good to go.
However, AzerothCore offers a better alternative to just adding custom scripts: Modules.
Modules offer the same features as Scripts, plus the ability to keep them completely detached from the main source code, so you can, for example, keep them in a separate repository.
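For instance, installing a module is usually just a matter of cloning it into the modules/ directory and recompiling. A rough sketch, where mod-example stands in for a real module repository and the paths assume an out-of-source build directory next to modules/:

cd azerothcore-wotlk/modules
git clone https://github.com/azerothcore/mod-example.git
cd ../build
cmake ../ -DCMAKE_INSTALL_PREFIX=$HOME/azeroth-server/ -DSCRIPTS=1
make -j 6 && make install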

How to import the JFXtras libraries to use in an assignment project?

What's the best way of importing the JFXtras libraries (specifically, jfxtras-labs) so that I can use them in my assignment project? I need to make sure that my code is able to run on both my and my professor's computer.
I'm new to JFXtras so any guidance would be highly appreciated!
This question has nothing to do with JFXtras specifically; it is a standard Java question: you download the appropriate jars (here, jfxtras-labs-[version].jar), plus any dependent jars (like jfxtras-common-[version].jar), and include them in Java's classpath. Build systems like Maven or Gradle make including dependencies easier by doing the downloading and classpath setup for you.
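For the plain-classpath route, a minimal sketch (MyAssignment and the lib/ folder are placeholders; on Windows, use ; instead of : as the path separator):

javac -cp "lib/*" -d out src/MyAssignment.java
java -cp "out:lib/*" MyAssignment

Drop jfxtras-labs-[version].jar and its dependencies into lib/ and the wildcard picks them up, which keeps the project portable between your machine and your professor's.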

Elastic Map Reduce External Jars

So, it is easy enough to handle external jars when using Hadoop directly: the -libjars option does it for you. The question is how to do this with EMR; there must be an easy way. I thought the -cachefile option of the CLI would do it, but I couldn't get it working somehow. Any ideas, anyone?
Thanks for the help.
The best luck I have had with external jar dependencies is to copy them (via bootstrap action) to /home/hadoop/lib throughout the cluster. That path is on the classpath of every host. This technique is the only one that seems to work regardless of where the code lives that accesses external jars (tool, job, or task).
One option is to have the first step in your jobflow set up the JARs wherever they need to be. Or, if they are dependencies, you can package them in with your application JAR (which is probably in S3).
FYI: for newer versions of EMR, /home/hadoop/lib is not used anymore; /usr/lib/hadoop-mapreduce should be used instead.
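A hedged sketch of such a bootstrap action, assuming the jars are staged in a bucket you control (s3://my-bucket/jars/ is a placeholder) and that the aws CLI is available on the nodes:

#!/bin/bash
# Bootstrap action: copy shared jar dependencies onto every node.
# /home/hadoop/lib applies to older AMIs; on newer releases, target
# /usr/lib/hadoop-mapreduce instead (see the note above).
aws s3 cp s3://my-bucket/jars/ /home/hadoop/lib/ --recursive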

What use does ./configure serve (other than checking dependencies)

Why does every source package that uses a makefile come with a ./configure script, and what does it do? As far as I can tell, it actually generates the makefile?
Is there anything that can't be done in the makefile?
configure is usually a result of the 'autoconf' system. It can generate headers, makefiles, really anything. 'Usually,' since some are hand-crafted.
The reason for these things is to allow source code to be compiled in disparate environments. There are many variations on Unix / Linux, with slightly different system headers and libraries. configure scripts allow code to auto-adapt.
The configure step is a sort of meta build. It generates the makefile that will work on the specific hardware / distribution you're running. For instance it determines the name of the C or C++ compiler and embeds that in the makefile.
The configure step will also frequently take a set of parameters, the values of which may determine what libraries need to be linked against. For instance, if you compile Apache HTTP Server with SSL enabled, it needs to link against more shared libraries than if you don't. Since linking is handled by the makefile, you need an extra step to create a custom makefile (rather than requiring the make command itself to take dozens or hundreds of options).
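To make that concrete, building Apache HTTP Server with SSL support looks roughly like this (the prefix path is just an example, and flag names vary between versions):

./configure --prefix=/usr/local/apache2 --enable-ssl
make
make install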
Everything could be done from within the makefile, but some build systems were implemented otherwise.
I don't personally use configure scripts for my projects, but I admit I mostly ship Erlang- and Python-based projects.
I don't think of the makefile as a script, I think of it as an input to the make utility.
The configure script (as its name suggests) configures the makefile, including as you say resolving dependencies.
If only from the idea of avoiding self-modifying code, the things in the configure script don't really belong in the makefile.
The point is that autoconf, autoheader, and automake form an integrated system that makes cross-platform building on Unix relatively straightforward. The docs are really bad and there are lots of horrible gotchas, but on the other hand there are a lot of working samples.
When I first came across this stuff I thought, "ha, I can do that with a nice clean makefile," and proceeded to rework the source that way. Big mistake. Learn to write and edit configure.ac and Makefile.am files; you will be happy in the end.
To answer your question, configure is good for:
checking whether function foo is available on this platform and, if so, which include and library are needed
letting the builder choose whether feature wizzbang is included, in a nice, simple, consistent way
Both are sketched in the example below.
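A minimal configure.ac sketch covering both points, with foo, foo_init, and wizzbang as placeholder names:

AC_INIT([demo], [0.1])
AM_INIT_AUTOMAKE([foreign])
AC_PROG_CC
# Is function foo_init available, and does -lfoo provide it?
AC_CHECK_LIB([foo], [foo_init])
# Let the builder opt in with --enable-wizzbang.
AC_ARG_ENABLE([wizzbang],
  [AS_HELP_STRING([--enable-wizzbang], [include the wizzbang feature])])
AM_CONDITIONAL([WIZZBANG], [test "x$enable_wizzbang" = "xyes"])
AC_CONFIG_FILES([Makefile])
AC_OUTPUT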

Best build system for embedded development/cross-compiling

I'm doing some development right now using dsPICs, and I'm not exactly in love with MPLAB. I'm actually using Visual Studio with a makefile project. Currently I'm using SCons, which seems to work fairly well after finding a helpful guide to setting it up with an alternate compiler. Still, I can't help but wonder: is there a better build system for this? And is there a better way to make SCons do this?
Just use vim and makefiles, and call the MPLAB command-line compiler yourself.
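A bare-bones sketch of that approach, assuming the dsPIC compiler driver is called pic30-gcc (newer Microchip toolchains ship it as xc16-gcc) and that the device and file names are placeholders; recipe lines must be indented with tabs:

CC     = pic30-gcc
CFLAGS = -mcpu=33FJ128GP802 -Wall -O1
OBJS   = main.o uart.o

firmware.elf: $(OBJS)
	$(CC) $(CFLAGS) -o $@ $(OBJS)

%.o: %.c
	$(CC) $(CFLAGS) -c $< -o $@

clean:
	rm -f $(OBJS) firmware.elf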
There are quite a few build systems that you can use:
Buildroot http://www.buildroot.net/
Buildroot-ng http://wiki.openwrt.org/
crosstool-NG http://www.crosstool-ng.org/
PTXdist http://www.ptxdist.org/
OpenEmbedded http://www.openembedded.org/
OE-lite http://oe-lite.org/
muddle https://code.google.com/p/muddle/
Poky http://pokylinux.org/
OpenBricks http://www.openbricks.org/
Yocto Project http://www.yoctoproject.org/
Scratchbox http://www.scratchbox.org/
Cross Linux From Scratch http://www.cross-lfs.org/
Aboriginal Linux http://landley.net/aboriginal/
The very simplest way to do embedded development is to use your favourite code editor for writing the code, then switch to the compiler's IDE to build and download the code to the processor.
Obviously, the code editor and the compiler IDE may be the same thing, which is even simpler!
