VIM: how to get the file path/directory of the opened buffer and do something?

My scenario is this: I'm using vim to open some .cpp files, for example
vim 1.cpp src/2.cpp root/src/3.cpp
Sometimes I wish to rebuild 3.cpp, so I have to use another window to run
"rm root/src/3.o"
and inside vim, type
":make"
This works fine, no problem. But I am looking for a .vimrc function/command that does the following:
When I switch to a buffer, e.g. "root/src/3.cpp", and invoke this command, vim detects the directory "root/src" and the file name without its suffix, "3", and automatically executes "rm root/src/3.o".
That way I can switch to any buffer and re-trigger the build of that particular file.
Note that I don't want to map a make-tool command like "make clean", because we use several different build tools (scons, cmake, etc.).
So how do I write this function/command in my .vimrc? Thanks.

:call system('rm '.expand('%:p:r').'.o'), as @Kent said, or even simply :!rm %:p:r.o.
But I'm quite surprised you need to do that. Tools in charge of compilation chains usually understand dependencies (whichever tool it is), and you shouldn't need to remove the object file often enough to warrant a mapping for it.
PS: it's perfectly possible (but I need to update the doc) to support CMake, or out-of-source compilation, from vim. But indeed, with out-of-source compilation you wouldn't need to delete those files manually: a :make clean would suffice if :make already works.

You can get root/src/3 from the root/src/3.cpp buffer with:
expand('%:p:r')
Then you are free to concatenate ".o" to the end and build the command.
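Putting both answers together, a minimal sketch for your .vimrc (the command name RmObj and the hard-coded .o suffix are my own choices, not anything standard):

" Remove the object file corresponding to the current buffer,
" e.g. root/src/3.cpp -> rm root/src/3.o
command! RmObj call system('rm ' . shellescape(expand('%:p:r') . '.o'))

After that, switch to any buffer and run :RmObj followed by :make.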

Related

How to build an rpm that installs host-dependent files

I have to build one rpm that copies the contents of file A to /path/to/targetfile if the hostname is A. In all other cases the contents of B should be copied to /path/to/targetfile. I'm aware that this may be a misuse of rpm, but I still have to do it like this. Do you have any ideas how to get this done in an elegant way?
My solution at the moment would be to create an empty /path/to/targetfile in my BUILD directory, as well as a /tmp/contents.tar.gz that contains the files A and B. In the post-install scriptlet I would then extract the relevant parts of /tmp/contents.tar.gz to /path/to/targetfile and delete the tarball afterwards. In the pre-uninstall scriptlet I'd then touch /tmp/contents.tar.gz to suppress rpm reporting errors for an already-deleted file.
To me this seems a very dirty way to get this done. Do you have better ones?
If you plan on abusing rpm for things it was not designed for, you'll have to do dirty tricks.
I don't see another workaround for you. I fail to see the point of removing the tar.gz etc., unless that (little?) extra space is really a problem for you. I would propose the following (a spec-file sketch follows this list):
package all files (A and B) into some specific directory (/usr/lib/your-package or whatever), not in compressed format.
in the %post section create just symlinks, so that /path/to/targetfile points to /usr/lib/your-package/A or /usr/lib/your-package/B (symlinks take up almost no space). This has the additional value that ls -l /path/to/targetfile will show you which file it points to, telling you whether this is file A or B.
in your %files section declare %ghost /path/to/targetfile for a nice cleanup upon removal.
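A sketch of the relevant spec-file fragments (the /usr/lib/your-package path and the hostname test are illustrative assumptions):

%files
/usr/lib/your-package/A
/usr/lib/your-package/B
%ghost /path/to/targetfile

%post
# pick the file matching this host and point the ghost target at it
if [ "$(hostname)" = "A" ]; then
    ln -sf /usr/lib/your-package/A /path/to/targetfile
else
    ln -sf /usr/lib/your-package/B /path/to/targetfile
fi

The %ghost entry tells rpm that the file belongs to the package without rpm installing it, so it is cleaned up on package removal.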

Making multiple files from multiple files with one command in gnu make

Assume that 1000 files with extension .xhtml are in directory input, and that a certain subset of those files (with output paths in $(FILES), say) needs to be transformed via XSLT to files of the same name in directory output. A simple make rule would be:
$(FILES): output/%.xhtml : input/%.xhtml
	saxon -s:$< -o:$@ foo.xslt
This works, of course, doing the transform one file at a time. The problem is that I want to use saxon's batch processing to do all the files at one time, since, given the number of files, that would be much faster, considering the overhead of loading java and saxon for each file. Saxon allows the -s (source) option to be a directory and processes all files in that directory, placing the results with the same name in the directory specified in the -o: option.
I'm aware of the well-known technique to get GNU make to do a single command to update multiple files by using pattern rules:
output/%.xhtml: input/%.xhtml
	saxon -s:input -o:output foo.xslt
But in my case this suffers from two problems. First, it will run the transform on all files in the input directory, not just the ones that have changed; and second, it will not limit the transform to the subset of files specified in $(FILES). The GNU make feature of running a recipe given in a pattern rule only once for all matched targets does not work for so-called "static pattern rules" (see [here]), and the rule given at the top of this post is one of those.
In order to use the saxon batching feature, I need to create a temporary directory, copy to it only those files to be processed, then run the transform with that temporary directory as the input directory. I tried creating a temporary directory and remembering its name in a target-specific variable for future use, using
$(FILES): TMPDIR:=$(shell mktemp -d)
but this creates a new temporary directory for every single target that is out-of-date. In any case, I'm not sure how to structure the rule that would then copy the necessary files into that directory. I don't want to create the temporary directory at the time the makefile is parsed, since I have a non-recursive make system that will parse all make files, even those not related to the current top-level target, and don't want to create the temporary directory for situations in which it is not necessary/will not be used.
I'm well aware that many questions have been asked on SO in the past about creating multiple files from a single input; one solution is (non-static) pattern rules; other solutions involve phony targets. However, in this case I'm stuck as to how to put all this together.
I can identify the files that changed and copy them using the static pattern rule
$(FILES): output/%.xhtml : input/%.xhtml
	TMPDIR=`mktemp -d`; cp $< $$TMPDIR  # one shell line, so $$TMPDIR is visible to cp
but actually I would prefer to copy the files with a single cp command, whereas this copies them one by one. Perhaps there is some application here of cp -u?
I also considered using an ad-hoc extension for those files needing updating but could not see how to get this to work either. I'm about to give up and just run the saxon transform on all files when any of them have changed, but is there any better way?
Personally, I wouldn't try to do this from the command line. That's partly because I'm not a shell-scripting wizard. I'm not an Ant wizard either, but since the requirement is to process only the files that have changed, this seems to fall very much into Ant territory. On the other hand, Ant will recompile the stylesheet for each transformation, which is an overhead you might want to avoid; if that's the case then your best bet is probably to write a little Java application. It's probably only 100 lines or less.
The final possibility is to do the processing within Saxon: that is, a single transformation that reads multiple input files using the collection() function and generates multiple result files using xsl:result-document. Saxon (commercial editions) offers an extension function last-modified that allows you to filter the files to be processed. With 1000 files you might also want the extension function saxon:discard-document() to keep the heap from filling up.
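A sketch of that single-transformation approach in XSLT 2.0 (the collection URI filter and the output naming are assumptions; Saxon's collection() supports the ?select= syntax):

<!-- read every .xhtml file in input/ and write a same-named result to output/ -->
<xsl:template name="main">
  <xsl:for-each select="collection('input/?select=*.xhtml')">
    <xsl:result-document href="output/{tokenize(document-uri(.), '/')[last()]}">
      <xsl:apply-templates select="."/>
    </xsl:result-document>
  </xsl:for-each>
</xsl:template>

You would run it with something like saxon -it:main foo.xslt, with no -s option, since the stylesheet pulls in its own inputs.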
Personally, I like your original one-compiler-per-file formulation. Does not this work well with make's -j n flag?
You can of course batch up files by copying, and then running saxon at the end. Recursive make (ugh!) can sort out the ordering. Something like:
.PHONY: all
all:
	rm -rf tmpdir
	mkdir tmpdir
	${MAKE} tmpdir/sentinel
	saxon -s:tmpdir -o:output foo.xslt

tmpdir/sentinel: $(FILES) ; touch $@

$(FILES): output/%.xhtml: input/%.xhtml
	ln $< $(patsubst input/%,tmpdir/%,$<)
This does work, though I am very queasy about lying to make (the static pattern rule purports to create the target in output/, but in fact does its dirty deed in tmpdir/).
Note that in the recipe for tmpdir/sentinel, $? is correctly set to the list of output files that are out of date. This might be useful if you can pass a list of files to saxon rather than a directory.
I think one issue here is that saxon supports either one file or all the files in a directory, so it isn't suitable for batch processing without copying to a temporary directory.
Otherwise, this is quite simple to do by using a timestamp marker file as a proxy target. For example:
output/.timestamp : $(FILES)
	mkdir -p $(@D)
	$(COMMAND) -outputdir=output $?
	touch $@
The three commands are:
Ensure that the output directory exists.
Run the batch command on files newer than the timestamp file.
Update the timestamp file (creating it if necessary).
Remember that each line of a recipe is executed in its own subshell, and that if any command line fails, subsequent lines are not invoked.
This approach is useful with Java builds.
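Applied to the saxon case above, a sketch might look like this ($(INPUTS) and the tmpdir staging directory are assumptions; note that cp $? copies all out-of-date inputs in a single command, which is what the original post asked for):

INPUTS := $(patsubst output/%,input/%,$(FILES))

output/.timestamp: $(INPUTS)
	rm -rf tmpdir
	mkdir -p output tmpdir
	cp $? tmpdir/
	saxon -s:tmpdir -o:output foo.xslt
	rm -rf tmpdir
	touch $@

Since the prerequisites are the input files, $? expands to only the inputs newer than the timestamp, so saxon batch-processes exactly the changed subset.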

Copy files before compilation

I have a master project with many subprojects, which I compile using qmake.
In one sub-project I must copy some files (some header files) before compilation. I've seen commands to perform operations before and after linking, but I'd like to know if it's possible to perform some shell operation before compilation starts. I can't refer to the files in place; I must copy them (don't ask why please, it's not my fault :-( ). Any suggestions?
Thanks in advance for your replies.
See my answer to a nearly identical question:
Copy some file to the build directory after compiling project with Qt
The only difference for you is to change point 5 from:
POST_TARGETDEPS += copyfiles ## copy files after source compilation
to:
PRE_TARGETDEPS += copyfiles ## copy files before source compilation
Note that all the files already have to exist in the filesystem when qmake is executed.
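For reference, a sketch of what that copyfiles extra target might look like in the .pro file (the extra_headers path is a placeholder; $(COPY_FILE) is a variable qmake defines in the generated Makefile):

# copy the needed headers into the build directory before compiling
copyfiles.commands = $(COPY_FILE) $$PWD/extra_headers/*.h $$OUT_PWD/
QMAKE_EXTRA_TARGETS += copyfiles
PRE_TARGETDEPS += copyfiles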
I think what you want to do can be accomplished with careful use of the QMAKE_EXTRA_COMPILERS and QMAKE_EXTRA_TARGETS variables. The Qt Labs article The Power of QMake gives a reasonable introduction to it. The ".commands" part of the extra compiler can be any arbitrary command, including a shell command.
The other suggestion I found in this e-mail exchange is to "... take a look at mkspecs/features/*.prf - especially those of moc and uic.." for other possible ways to do it.
I also just played around with QMAKE_EXTRA_TARGETS to solve the question, but could not manage to get it done ;)
One other (simple) solution which might work for you is to wrap the call to gcc/g++: in the .pro file, set QMAKE_CXX=./g++Wrapper and in the g++Wrapper shell script, call the original compiler while doing anything you want before and after the call:
#!/bin/bash
DoWhateverYouWantBeforeCompilation
g++ "$@"   # "$@" preserves argument quoting, unlike $*
DoWhateverYouWantAfterCompilation
By evaluating the command line parameters, you could also restrict your actions to specific files.
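For instance, a sketch of such a wrapper that triggers the extra step only for one specific file (the file name and the copy step are placeholders):

#!/bin/bash
# run the pre-compilation copy only when a particular file is being compiled
case "$*" in
  *special.cpp*) cp /path/to/extra_headers/*.h . ;;
esac
g++ "$@"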

How do I get vim's :sh command to source my bashrc?

Whenever I start a shell in vim using :sh, it doesn't source my ~/.bashrc file. How can I get it to do this automatically?
See :help 'shell'. You can set this string to include -l or --login, which will source your .bashrc file. So, you might have a line like this in your .vimrc:
set shell=bash\ --login
Note that this will alter everything that invokes the shell, including :!. This shouldn't be much of a problem, but you should be aware of it.
The default value of this option can also be changed by setting the $SHELL environment variable.
If it doesn't source your .bashrc file, it may still source your .bash_profile file. I usually make one of them a symlink to the other. If your .bashrc performs some particularly odd one-time operations, you may have to edit it to only perform those operations with a login shell, but I've never had problems with it.
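If you go that route, a sketch of such a guard in your ~/.bashrc (the one-time operation is a placeholder):

# run expensive one-time setup only in login shells
if shopt -q login_shell; then
    setup_dev_environment   # placeholder for the odd one-time operations
fi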
~/.vimrc
cmap sh<CR> !bash --login<CR>
If you quickly type "sh<Enter>" on the command line, you start bash with your ~/.bashrc sourced. So dirty.

Qt qmake - how to stop it adding rules to delete the target

I am trying to add a unit test to a group of other tests. All the tests are in their own subdirectories, each with its own .pro file and the .cpp file that contains the tests themselves. Running qmake in one of the subdirectories creates a Makefile, and then running make runs the compiler to make the TARGET. The tests are actually run by the 'check' target, i.e. with 'make check'.
The test I'm trying to add is different, but it is trying to pretend to behave the same way.
It is different because it is a perl script and so doesn't need to be compiled. It does, however, need to be run - so 'make check' needs to work.
I had a .pro file working for the most part - 'qmake', 'make', 'make check', and 'make clean' would work, but 'make distclean' removed my script (since it assumes it can be regenerated by compiling something).
So, the question is, how do I stop it from removing my script?
Perhaps there's some other approach I should be taking. I had tried the 'subdirs' TEMPLATE, but that does more than just remove the line in Makefile that deletes the TARGET.
Ideas?
Using Ubuntu Linux with Qt 4.6.0.
I would look into the custom target capabilities for your script. Maybe something like this:
check.commands = <scriptname>
check.depends = <any dependencies>
QMAKE_EXTRA_TARGETS += check
Doing things this way will run the check command when the dependencies change, but as long as you don't specify check.target then it shouldn't remove anything. (If your script does produce output, then perhaps that should be in check.target.) Also, since it is specified as an "extra" command, qmake shouldn't create commands to delete your script in a distclean.
This is assuming that your script is in its own subdirectory (which you state), and is the only "check" command that needs to run in that subdirectory (kind of implied by the question, but not directly stated).
