I'm trying to have my compiled language files copied into my output directory during the build process. I've got the copying working, but not the creation of the directory. After a lot of googling I came up with this:
LANGDIR = $$OUT_PWD
win32:CONFIG(debug, release|debug):LANGDIR = $$LANGDIR/debug/lang
win32:CONFIG(release, release|debug):LANGDIR = $$LANGDIR/release/lang
makeLang.commands += $${QMAKE_MKDIR} $$shell_path($${LANGDIR})
first.depends = $(first) makeLang
export(first.depends)
export(makeLang.commands)
QMAKE_EXTRA_TARGETS += first makeLang
This does the job for the most part; however, when the directory lang already exists, the build process fails. I know QMAKE_CHK_DIR_EXISTS exists, but I have no clue how to use it as a conditional. I figured it might be something like !$${QMAKE_CHK_DIR_EXISTS} $$shell_path($${LANGDIR}) : $${QMAKE_MKDIR} $$shell_path($${LANGDIR}), but that just crashes jom.exe; I didn't really expect that to work anyway.
I'm also open to suggestions for better ways to do what I'm trying to do. Ideally the whole thing should be platform independent so I can have artifacts generated in my CI pipeline that contain the language files.
You should use QMAKE_MKDIR_CMD, which creates the directory only if it doesn't already exist.
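For reference, a sketch of that substitution against the snippet from the question (LANGDIR and makeLang are the question's names). This assumes your mkspec's QMAKE_MKDIR_CMD takes the directory as a trailing argument, as mkdir -p does on Unix; if your spec defines it with %1 placeholders, you may need $$sprintf($$QMAKE_MKDIR_CMD, $$shell_path($$LANGDIR)) instead.
# Same extra-target setup as in the question, with QMAKE_MKDIR replaced by
# QMAKE_MKDIR_CMD so a pre-existing lang directory no longer fails the build.
makeLang.commands += $${QMAKE_MKDIR_CMD} $$shell_path($${LANGDIR})
first.depends = $(first) makeLang
export(first.depends)
export(makeLang.commands)
QMAKE_EXTRA_TARGETS += first makeLang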
I need to include a file when I build a package, but the file itself is not required at all at runtime.
I have been told to look at how to do it using a do_ function, but have been unable to find a suitable function in any documentation. I would assume this could be done trivially, by simply specifying which file to add.
Again, I'm not looking for a way to add the files to the final image. This is just for building.
Here is the full documentation about tasks. They all start with do_, so you were advised to extend one of them. Also (as written in the docs), packages may have additional tasks that come from classes (.bbclass files) applied to your package's recipe with the inherit keyword. To get the full list of tasks for your specific package, use the command
bitbake -c listtasks <your package name>
Actually, this way you just run your package's do_listtasks task; do_listtasks is itself a task.
So first you need to work out which task your file should be attached to. Tasks have pretty straightforward names, so to use your file during compilation you need the do_compile task, and so on.
Now, it is not entirely clear how you intend to actually add your file to the build, but it looks like it should also happen through your package's recipe. That means you should place your file in a folder named files (there are options for naming this folder) next to your recipe, then add the file to the SRC_URI variable like this:
SRC_URI = " \
.....
file://your-file-in-folder-files.ext \
.....
"
By doing this, you tell bitbake to copy this file into WORKDIR during the do_unpack task. More on this here, especially in section 3.3.5.
Now to your actual question. Suppose you need to use your file during the do_compile task, but before the actual compilation actions. Then extend do_compile in your recipe like this:
do_compile_prepend() {
    cp ${WORKDIR}/your-file-in-folder-files.ext ${S}/some/subpath/inside/sources
}
Similarly, to do something with your file after a task's main actions, create a function with the _append suffix, for example do_configure_append or do_compile_append; see the sketch right after this paragraph.
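For instance, a hypothetical do_compile_append that copies a build artifact out of the source tree after compilation could look like this (the file name and sub-path are placeholders):
do_compile_append() {
    # Hypothetical example: after the main compile step has run, copy a
    # generated artifact from the source tree back into WORKDIR.
    cp ${S}/some/subpath/inside/sources/generated-output.ext ${WORKDIR}/
}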
Here WORKDIR is a folder under the Yocto build directory where everything needed to build your package is placed, S is a special folder under WORKDIR containing the downloaded and unpacked source code, and do_compile_prepend (like the other task functions) is just a shell script.
Also consider the folder D (it is also located under WORKDIR). It contains the files that will actually go into the resulting image during image creation. So if your file somehow finds its way into the resulting image, remove it directly from D like this:
do_install_append() {
    rm -f ${D}/path/to/your/file/in/resulting/rootfs/your-file-in-folder-files.ext
}
Well... this is more of an overview than an exact answer, but I hope it helps!
I've been struggling all afternoon to track down an issue with the Qt VS Tools in Visual Studio 2013. I'm trying to update an existing .vcxproj file that used a home-grown mechanism for generating MOC, UIC, etc. files to use the Qt VS Tools mechanism instead.
The problem I'm having is in the MOC command that's getting generated for .h files that include the Q_OBJECT macro. A sample line (reduced for brevity) is here:
<Command Condition="'$(Configuration)|$(Platform)'=='Debug|x64'">"$(QTDIR)\bin\moc.exe" "%(FullPath)" -o ".\GeneratedFiles\$(ConfigurationName)\moc_%(Filename).cpp" "-I$(QTDIR)\include\QtGui" "-I$(NOINHERIT)"</Command>
The problem is that NOINHERIT doesn't exist, so the "-I$(NOINHERIT)" gets evaluated to "-I" without a value, and the MOC compiler complains and doesn't generate the MOC file. I've tried cleaning up inherited paths, checking and unchecking the "Inherit from parent or project defaults", and the only change I sometimes see is that it has "-I" without the NOINHERIT macro.
Completely starting over with a new .vcxproj file is beginning to feel like my only hope, but that's a much larger task than I'd like to take on, since there are a significant number of project files with interdependencies that I'd rather not recreate.
I'm using the latest Qt VS Tools, which is version 2.3.2. Any ideas on how to resolve this?
Naturally, five minutes after I post, I found the issue. An included property file had this:
<AdditionalIncludeDirectories></AdditionalIncludeDirectories>
Rather than this, which solved the problem:
<AdditionalIncludeDirectories>%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories>
Interestingly, and for what it's worth, this did not work:
<AdditionalIncludeDirectories />
Add %(AdditionalIncludeDirectories) under Project > C/C++ > General > Additional Include Directories.
%(AdditionalIncludeDirectories) is included by default, but if for some reason it gets overwritten by mistake, you will get an error such as:
Moc'ing XXXXXXX.h...
Missing value after '-I'.
I'm seeing strange behavior from qmake while trying to write something to another file. I've read all the manuals I could find and searched the internet, but found nothing similar. Here is close to the simplest possible code:
!system(echo 1 > d:\1.txt) {
    warning(Cant create a file)
}
It doesn't create the file, and it doesn't show the warning, which would mean the operation succeeded. Another example:
var = test string
file = $$absolute_path(d:\1.txt)
message(Variable: $$var and filename: $$file)
!write_file($$file, var) {
    warning(Cant create a file)
}
This block produces output:
Project MESSAGE: Variable: test string and filename: d:/1.txt
And nothing is said about the fact that the file has not been created.
I've already checked the rights to write to the directory: everything seems fine.
Can anybody help me with this?
UPD:
I've found something else: message($$system(echo 1 > 1.txt)) works fine. And this is what makes me cry, because I really don't understand what is going on.
Huh! I found the solution, and it may sound like "buy yourself some brain". I thought that qmake was launched every time the project file changed (the output of the message commands seemed to prove this), but that's not really the way things happen in Qt Creator.
I don't know exactly how, but it parses the .pro file and performs only some necessary operations; as far as I can see, the system() call (which would modify a file) and the write_file() commands don't seem to be invoked.
The SOLUTION is very simple: launch qmake explicitly via Build > Launch qmake.
How can a sourced or Sweaved file find out its own path?
Background:
I work a lot with .R scripts or .Rnw files.
My projects are organized in a directory structure, but the path of the project's base directory frequently varies between different computers (e.g. because I do just parts of the data analysis for someone else, and their directory structure is different from mine: I have project base directories like ~/Projects/StudentName/ or ~/Projects/Studentname/Projectname, and most students who have just their one project usually have it under ~/Measurements/ or ~/DataAnalysis/ or something the like, which wouldn't work for me).
So a line like
setwd (my.own.path ())
would be incredibly useful, as it would ensure that the working directory is the base path of the project regardless of where that project actually is, without requiring the user to think about setting the working directory.
Let me clarify: I am looking for a solution that works when an unthinking user presses the editor's/IDE's source or Sweave keyboard shortcut.
Just FYI, knitr will setwd() to the dir of the input file when (and only when) evaluating the code chunks, i.e. if you call knit('path/to/input.Rnw'), the working dir will be temporarily switched to path/to/. If you want to know the input dir in code chunks, currently you can call an unexported function knitr:::input_dir() (I may export it in the future).
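A minimal sketch of that, relying on the unexported helper mentioned above (so it may change between knitr versions):
# Inside a code chunk of input.Rnw, while knit('path/to/input.Rnw') is running:
getwd()              # temporarily 'path/to/' while the chunks are evaluated
knitr:::input_dir()  # directory of the input file, via the unexported helper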
Starting from gsk3's and Seb's suggestions, here's an idea:
the combination of username (login) and IP or name of the computer could be used to select the right directory.
That leads to something like:
setwd (switch (paste (Sys.info () [c ("user", "nodename")], collapse="."),
               user.laptop = "~/Messungen",
               user2.server = "~/Projekte/Projekt/"
))
So there is an automatic solution that:
- works with source
- works with Sweave
- even works for interactive sessions where the commands are sent line by line
The combination of user and nodename of course needs to be specific, and the paths need to be edited by hand, though.
Improvements welcome!
Update:
Gabor Grothendieck answered the following to a related question on r-help today:
this.dir <- dirname(parent.frame(2)$ofile)
setwd(this.dir)
which will work for source.
Another update: I now do most of the data analysis work in RStudio. RStudio's projects basically solve the problem: RStudio changes the working directory to the project root directory every time I switch between projects.
I can therefore put the project directory as far down my directory tree as I want (and the students can also put their copy wherever they want) and sync the data files and scripts/.Rnws via version control (We use a private git server). The RStudio project files are kept out of the version control, i.e. .gitignore contains .Rproj.user.
Obviously, within the project, the directory structure needs to be synchronized.
You can use sys.calls() to get the command used to source the file. Then you need a bit of trickery using regular expressions to get the pathname, bearing in mind that source("something/filename") could have used either the absolute or relative path. Here's a first attempt at putting all the pieces together: try inserting the following lines at the top of a source file.
whereFrom=sys.calls()[[1]]
# This should be an expression that looks something like
# source("pathname/myfilename.R")
whereFrom=as.character(whereFrom[2]) # get the pathname/filename
whereFrom=paste(getwd(),whereFrom,sep="/") # prefix it with the current working directory
pathnameIndex=gregexpr(".*/",whereFrom) # we want the string up to the final '/'
pathnameLength=attr(pathnameIndex[[1]],"match.length")
whereFrom=substr(whereFrom,1,pathnameLength-1)
print(whereFrom) # or "setwd(whereFrom)" to set the working directory
It's not very robust; for instance, it will fail on Windows with source("pathname\\filename"), and I haven't tested what happens if one file sources another, but you might be able to build a solution on top of this.
I have no direct solution for obtaining the directory of the file itself, but if you have a limited range of directories and directory structures, you can probably use
if (file.exists("c:/somedir")) setwd("c:/somedir")
You could check for the pattern of the directory in question and then set the directory accordingly; a sketch of that idea follows below. Does this help?
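Extending that idea slightly, a sketch (the candidate paths are made up, borrowing the directories mentioned in the question):
# Try a few known project base directories and switch to the first one
# that actually exists on this machine.
candidates <- c("~/Projects/StudentName", "~/Measurements", "c:/somedir")
hit <- candidates[file.exists(candidates)]
if (length(hit) > 0) setwd(hit[[1]])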
An additional problem is that the working directory is a global variable, which can be changed by any script, so if your script calls another script, it will have to set the wd back. In RStudio I use Session -> Set Working Directory -> To Source File Location (I know, it's not ideal), and then my script does
wd = getwd ()
...
source ("mySubDir/myOtherScript.R", chdir=TRUE); setwd (wd)
...
source ("anotherSubDir/anotherScript.R", chdir=TRUE); setwd (wd)
In this way one can maintain a stack of working directories. I would love to see this implemented in the language itself.
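In the meantime, a small helper in that spirit (the function name is my own) that sources a script and guarantees the previous working directory is restored afterwards, even if the sourced script changes it or fails:
source_keeping_wd <- function(path) {
    old.wd <- getwd()
    on.exit(setwd(old.wd), add = TRUE)  # restore the wd no matter what happens
    source(path, chdir = TRUE)          # run the script with its own directory as wd
}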
This answer works for source and also inside nvim-R - I have no idea if it works with knitr and similar things. Any feedback appreciated.
If you have multiple scripts source-ing each other, it is important to get the correct one. That is, the largest i for which sys.frame(i)$ofile exists.
get.full.path.to.this.sourced.script = function() {
    for(i in sys.nframe():1) {        # Go through all the call frames,
                                      # in *reverse* order.
        x = sys.frame(i)$ofile
        if(!is.null(x))               # if $ofile exists,
            return(normalizePath(x))  # then return the full absolute path
    }
}
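A usage sketch, assuming the function above sits at the top of the sourced script: switch to the script's own directory if it could be determined.
this.script <- get.full.path.to.this.sourced.script()
if (!is.null(this.script)) setwd(dirname(this.script))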
This question is for those of you who happen to use R, on a Mac, in combination with MacroMates' TextMate text editor (http://macromates.com/) and the "R" bundle. All of which are nifty, needless to say, but that's beside the point for now :-)
I've got a .RProfile file sitting in my default "~" startup directory, and it's got a number of useful functions in it that I like to have access to when writing R scripts. But I also use TextMate for most of my writing, and the cmd-R functionality to run my scripts within TextMate.
At the moment, I don't know how to tell Textmate where my .Rprofile is.
Is there a way--most likely through Textmate's Bundle settings--that I can point Textmate to my .RProfile so I don't have to write my functions into every script on a per-script basis?
OR
Is it actually better to include any custom functions in any script I write, so that anyone with a basic R setup can source and run my scripts?
I feel like I must be missing a dead-easy setting or config file here within either Textmate or the R environment it calls to run my scripts.
Thanks so much in advance!
The R Bundle Developer is apparently working on this (see this Post on the Mailing List) but it's not available at the moment.
In the meantime, you have a couple of choices.
First, you can create a new bundle (e.g., "briandk-R"), then create a snippet within that bundle, either with 'source($1)' or with the file you want to source hardcoded instead of the placeholder (so, e.g., source("~/some_file_to_source.R")). If you do the latter, you can configure TM to source your file via a tab trigger: in the Bundle Editor, toggle over to 'Settings' (upper left-hand corner), type "source.r, source.rd.console" in the 'Scope Selector' field, then choose a few letters for your tab trigger (e.g., "src").
If you don't want to do that, go to the 'Rdaemon' directory (which is either in your home directory or in ~/Library/Application Support/Rdaemon). In it you will see another directory called "daemon". In there is a file called "start.r", which lists the files that are sourced upon starting R from the Rdaemon. You know what to do from there. (Note: this directory also contains a couple of other scripts with initial settings; you might wish to have a look at those as well.)
The first part of Doug's response offers the simplest immediate solution... add
source('/Users/briandk/.Rprofile')
to the head of any .r files you want those functions in... with that one line of code, you get your utility functions. Of course, that only helps if you're running the whole TM file.
Ideally, the bundle will be updated... perhaps to support a shell variable via TM's preferences???
TM_RPROFILE
which could be set to the path to your .Rprofile file.
I hacked this into tmR.rb with just two lines of code. To implement it, go to ~/Library/Application Support/TextMate/Pristine Copy/Bundles/ and Show the Contents of R.tmbundle.
In there, you'll find support/tmR.rb
In my version, near line 112, you should change
tmpDir = File.join(ENV['TMP'] || "/tmp", "TM_R")
recursive_delete(tmpDir) if File.exists?(tmpDir) # remove the temp dir if it's already there
Dir::mkdir(tmpDir)
# Mechanism for dynamic reading
# stdin, stdout, stderr = popen3("R", "--vanilla", "--no-readline", "--slave", "--encoding=UTF-8")
stdin, stdout, stderr, pid = my_popen3("R --vanilla --slave --encoding=UTF-8 2>&1")
# init the R slave
stdin.puts(%{options(device="pdf")})
stdin.puts(%{options(repos="#{cran}")})
to
tmpDir = File.join(ENV['TMP'] || "/tmp", "TM_R")
recursive_delete(tmpDir) if File.exists?(tmpDir) # remove the temp dir if it's already there
Dir::mkdir(tmpDir)
rprofile = (ENV['TM_RPROFILE'] == nil) ? "" : "source('" + ENV['TM_RPROFILE'] + "')"
# Mechanism for dynamic reading
# stdin, stdout, stderr = popen3("R", "--vanilla", "--no-readline", "--slave", "--encoding=UTF-8")
stdin, stdout, stderr, pid = my_popen3("R --vanilla --slave --encoding=UTF-8 2>&1")
# init the R slave
stdin.puts("#{rprofile}")
stdin.puts(%{options(device="pdf")})
stdin.puts(%{options(repos="#{cran}")})
Just added 2 lines there... the one that begins "rprofile =" and the one that includes "#{rprofile}"
-Wil