Yocto: how to remove a layer without rebuilding everything - qt

I'm playing with a Yocto project that has in its conf/bblayers.conf file the following line:
ADDONSLAYERS += "${@'${OEROOT}/layers/meta-qt5' if os.path.isfile('${OEROOT}/layers/meta-qt5/conf/layer.conf') else ''}"
I partially bitbaked the project but now I want to try to disable the whole meta-qt5 layer.
After commenting out the line above, how do I remove the already-built files from the output folder and carry on with the rest of the build?
I tried bitbake -c cleansstate meta-qt5, but it doesn't work. I guess it works only with recipes, not with whole layers.

The easiest way to clean a build is to remove the TMPDIR temporary folder (the default is <build>/tmp).
That removes the previous compilation results, but they are also kept in the SSTATE_DIR cache folder, so the next build will not rebuild everything; it will reuse the cached results to speed things up.
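For illustration, a minimal shell sketch of that workflow (the image target name is just an example, not taken from the question):
# Wipe TMPDIR (default <build>/tmp); the sstate cache is left untouched
rm -rf tmp
# Rebuild: tasks whose sstate objects are still valid are restored from
# the cache instead of being re-executed (example image name)
bitbake core-image-minimal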
You can then purge obsolete entries from the cache folder with the sstate-cache-management.sh script:
# Example of usage (after sourcing oe-init-build-env)
sstate-cache-management.sh --cache-dir=../sstate-cache -d -y


dotnet publish output folder?

The dotnet publish command publishes into the project's bin/netcoreapp2.2/Debug/publish folder, where netcoreapp2.2 presumably changes with the dotnet version and Debug changes with whatever configuration the -c parameter specifies.
For CI/CD purposes this is clearly undesirable. Alternatively, one can pass -o with an explicit output path, but again, in a CI/CD environment this path should be inside the project folder structure, e.g. something like:
dotnet publish -o publish
But because the publish command globs up all files, it picks up previous publish attempts and stores them recursively. This can be mitigated by cleaning the publish folder explicitly, and/or by adding an exclusion to the csproj for the project, but then there is a dependency between the build script and the csproj: if the publish path is changed in the build scripts for any reason without a corresponding csproj update, things break.
So the least fragile option seems to be to use the default output path, as that's automatically excluded from globbing, but how do I remove the version and configuration sensitivity? Is there a particularly safe way to get dotnet to tell my CI/CD environment what its output path for build / publish is?
Note: I do not have enough SO reputation to add a comment.
Refer to: dotnet publish
You could use a relative path with the -o option, so you avoid folder names that explicitly encode the runtime and platform identifiers.
Or consider using the build command with a publish profile, where you can specify an explicit path; generally, though, a relative path is less error prone.
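For instance, a hedged sketch of the relative-path approach (the folder name is just an example):
# Publish into a fixed, relative folder that the CI/CD pipeline controls,
# instead of the framework/configuration-dependent default path
dotnet publish -c Release -o ./artifacts/publish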
Hope this helps!

Can a pre-commit Git hook zip a directory and add it to the repository?

I'm doing development on a Wordpress plugin. My development directory contains a lot of development-specific stuff (e.g. Grunt files, Sass files, the git repository itself, etc.).
Obviously, I don't want to distribute this folder containing all of those development files; people don't want a few MB of Grunt files when they download my Wordpress plugin.
Up until now, though, my "release" process has been cumbersome:
Commit the Git changes
Zip the entire folder
Open the zip file and delete the .git folder, grunt files, and all the other development-specific files
Release the new zip
I don't know the best way to accomplish this, but I'm very vaguely familiar with Git hooks, and I had this thought: could I set up a Git hook that would zip ONLY the needed production files into a ZIP file and store it with the repo? That way, every time I commit it would automatically create a new release ZIP.
Is that possible? If so, could someone point me in the right direction?
Oh also, I'm on Windows (・_・;). So I'm hoping that there's a way to do it on Windows.
I can't speak for Windows, but:
It's technically possible to do that sort of thing in a pre-commit hook.
Don't.
A pre-commit hook that modifies "what you will commit" is annoying (if nothing else, it violates the "rule of least astonishment", where your version control system simply stores the versions you tell it to store). Apart from that, storing large pre-compressed binaries interferes with git's attempt to save space in pack files, and will cause rapid repository bloat, poor performance, running out of memory, and so on. A ZIP-archive is a pre-compressed binary and hence will behave badly.
In general, a more reasonable "hook-y" way to handle releases is to set up a "release server" to which you push new releases, and have the push trigger the archive-generation. (There are ways to do this without a separate server / repository, and you can do it in a more pull-style fashion, but the push-style is easy to illustrate.)
[Edit: I had originally considered git archive but did not realize you could get it to exclude files conveniently, so I wrote up the below instead. So, jthill's answer is better and should be one's first resort. I'll leave this in place as an alternative for cases where, for some reason, git archive might not do.]
For instance, here's a server-side post-receive hook code fragment that checks whether a branch whose name matches release* has been pushed-to, and if so, invokes a shell function with the name of the branch (once for each such branch):
#! /bin/sh
NULL_SHA1=0000000000000000000000000000000000000000
scan()
{
    local oldsha newsha fullref shortref
    local optype reftype
    while read oldsha newsha fullref; do
        case $oldsha,$newsha in
        $NULL_SHA1,*) optype=create;;
        *,$NULL_SHA1) optype=delete;;
        *)            optype=update;;
        esac
        case $fullref in
        refs/heads/*)
            reftype=branch
            shortref=${fullref#refs/heads/}
            ;;
        *)
            reftype=other
            shortref=$fullref
            ;;
        esac
        case $optype,$reftype,$shortref in
        create,branch,release*|update,branch,release*)
            do_release $shortref;;
        esac
    done
}
scan
(much of the above is boilerplate, which I have stripped down to essentials). You would have to write the do_release function, which might resemble (totally untested):
do_release()
{
    local tmpdir=/tmp/build.$$   # or use mktemp -d
    # $tmpdir/index is git's index; $tmpdir/t is the work tree
    trap "rm -rf $tmpdir; exit 1" 1 2 3 15
    rm -rf $tmpdir
    mkdir -p $tmpdir/t
    GIT_INDEX_FILE=$tmpdir/index GIT_WORK_TREE=$tmpdir/t git checkout -f $1
    # now clean out grunt files and make zip archive
    (cd $tmpdir/t; rm -rf grunt; zip -r ../t.zip .)
    # put completed zip archive in export location, name it
    # based on the branch name
    mv $tmpdir/t.zip /place/where/zip/files/live/$1.zip
    # clean up temp dir now, and no longer need to clean up
    # on signal related abort
    rm -rf $tmpdir
    trap - 1 2 3 15
}
There's actually a command for this, git archive.
git archive master -o wizzo-v1.13.0.zip
See the EXAMPLES section: you can select paths, add prefixes to them, define custom postprocessing by output extension, and a few more minor tweaks.
Also see the ATTRIBUTES section: you can give files -- arbitrary patterns, really -- an export-ignore attribute to exclude them from archives.
It's got a bunch more handy-dandies: you can get archives from remote repos, expand arbitrary git log --pretty=format: placeholders, and more; the git manpages are definitely worth whatever time you can invest in them.
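For illustration, a minimal sketch combining the two (Gruntfile.js and sass are assumed names for the development-only files, not taken from the question):
echo 'Gruntfile.js export-ignore' >> .gitattributes   # excluded from archives
echo 'sass export-ignore' >> .gitattributes           # a directory marked this way is skipped too
git archive master -o wizzo-v1.13.0.zip               # the marked paths do not appear in the zip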

Git Archive but first put all the files inside a folder then start archiving

As the title suggests, I want to know if there is a single git command that puts my whole project into one folder first (not including .gitignored files) and then archives that folder, leaving the ignored files out of the archive, which is nice.
This would be beneficial for me, as I am working on a WordPress plugin with multiple releases. Some references:
I want all the files (minus the .gitignored files) moved to a folder first, then that folder archived.
It is possible in one command provided you define an alias but this isn't git-related:
you can:
clone your repo elsewhere (that way you don't get any ignored or private file)
move your files as you see fit in that local clone
archive (tar cpvf yourArchive.tar yourFolder)
But git archive alone won't help you move those files, which is why I would recommend a script with custom bash commands (not git commands).
You don't really need to copy / clone the repo anywhere.
Make sure you committed all your changes.
Process the files any way you want.
Run tar -cvjf dist/archive-name.tbz2 --transform='s,^,archive-name/,' $(git ls-tree --full-tree -r --name-only --full-name HEAD)
Run git reset --hard to restore the tree without any of the changes you made in step #2.
Hints:
The --transform='s,^,archive-name/,' is so your files will be extracted to archive-name/...; you can remove it if you don't need that.
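For completeness, git archive itself can prepend a folder name to every path via --prefix, which may cover the original ask without touching the work tree (the archive name is just an example):
git archive --prefix=archive-name/ -o archive-name.zip HEAD   # every entry in the zip sits under archive-name/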

Building a library using autotools from cmake

This is my first try with cmake, and I would like to get, if possible, some feedback about what I did, since some problems remain.
In the CMakeLists.txt of the library folder, I created two makefile targets: configure-antlr3c and antlr3c. The first target runs the autotools configuration shell script, the second one runs the make executable to build the library:
# CMakeLists.txt in libantlr3c-3.1.3
add_custom_target(
    configure-antlr3c
    ${SHELL_EXECUTABLE} configure
    WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
)
add_custom_target(
    antlr3c
    ${MAKE}
    DEPENDS configure-antlr3c
    WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
)
The main problem is that the configure-antlr3c target is always "out of date", so it is always executed even if nothing has changed. Moreover, I have to generate my cmake makefiles in a separate directory (not in the root directory of my project) to avoid overriding the library's autotools Makefile...
Has anyone had this problem (building autotools projects with cmake)? And if so, what were your solutions?
Thank you.
EDIT: Solution
In the root CMakeLists.txt:
include(ExternalProject)
ExternalProject_Add(
    libantlr3c
    SOURCE_DIR ${CMAKE_CURRENT_SOURCE_DIR}/lib/libantlr3c-3.1.3
    CONFIGURE_COMMAND ${CMAKE_CURRENT_SOURCE_DIR}/lib/libantlr3c-3.1.3/configure --prefix=${CMAKE_CURRENT_SOURCE_DIR}/lib/libantlr3c-3.1.3
    PREFIX ${CMAKE_CURRENT_SOURCE_DIR}/lib/libantlr3c-3.1.3
    BUILD_COMMAND make
    BUILD_IN_SOURCE 1
)
I think that you'd be better off using the ExternalProject feature of cmake. I guess you have your project and have libantlr in a subdirectory?
project
+- libantlr
+- mysrc
---- etc ----
If that's the case, you can do something like this in the top level CMakeLists.txt:
cmake_minimum_required(VERSION 2.8)
project(test)
include(ExternalProject)
ExternalProject_Add(libantlr
    SOURCE_DIR ${CMAKE_CURRENT_SOURCE_DIR}/libantlr
    CONFIGURE_COMMAND ${CMAKE_CURRENT_SOURCE_DIR}/libantlr/configure --prefix=<INSTALL_DIR>
    BUILD_COMMAND ${MAKE})
The <INSTALL_DIR> is expanded to something like libantlr-prefix, so things are installed in your build tree rather than in /usr/local, which is what autotools would do without a prefix.
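A minimal usage sketch under the layout above (directory names are illustrative, not from the answer):
mkdir -p build && cd build   # out-of-source build, so the library's own autotools Makefile is untouched
cmake ..                     # generates the rules for the libantlr external project
make                         # runs configure and make for libantlr, installing under the build tree (e.g. libantlr-prefix)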
I needed to do something similar but found it surprisingly difficult to get a working solution, despite the example provided here in the accepted answer and the code snippets in several other blog posts, the CMake support mailing list archives, etc. For the benefit of others who come across this question, here is my solution.
The external project we wanted to use is libmodbus, though I believe my solution is general enough to work with any project configured with the standard autotools recipe of ./autogen.sh && ./configure && make && make install.
We wanted to add libmodbus as a submodule of our git repository. We added it to our repository at the path <root>/opt/libmodbus. The CMake code to configure it is located in <root>/cmake/modbus.cmake, which is included from our root CMakeLists.txt using:
# libmodbus
include(cmake/modbus.cmake)
The content of cmake/modbus.cmake is:
include(ExternalProject)
set(MODBUS_DIR ${CMAKE_CURRENT_SOURCE_DIR}/opt/libmodbus)
set(MODBUS_BIN ${CMAKE_CURRENT_BINARY_DIR}/libmodbus)
set(MODBUS_STATIC_LIB ${MODBUS_BIN}/lib/libmodbus.a)
set(MODBUS_INCLUDES ${MODBUS_BIN}/include)
file(MAKE_DIRECTORY ${MODBUS_INCLUDES})
ExternalProject_Add(
    libmodbus
    PREFIX ${MODBUS_BIN}
    SOURCE_DIR ${MODBUS_DIR}
    DOWNLOAD_COMMAND cd ${MODBUS_DIR} && git clean -dfX && ${MODBUS_DIR}/autogen.sh
    CONFIGURE_COMMAND ${MODBUS_DIR}/configure --srcdir=${MODBUS_DIR} --prefix=${MODBUS_BIN} --enable-static=yes --disable-shared
    BUILD_COMMAND make
    INSTALL_COMMAND make install
    BUILD_BYPRODUCTS ${MODBUS_STATIC_LIB}
)
add_library(modbus STATIC IMPORTED GLOBAL)
add_dependencies(modbus libmodbus)
set_target_properties(modbus PROPERTIES IMPORTED_LOCATION ${MODBUS_STATIC_LIB})
set_target_properties(modbus PROPERTIES INTERFACE_INCLUDE_DIRECTORIES ${MODBUS_INCLUDES})
A component that uses libmodbus can declare its dependency as usual:
add_executable(hello_modbus main.cpp)
target_link_libraries(hello_modbus modbus)
A few notes:
This abuses the DOWNLOAD_COMMAND to perform the autogen.sh step. The git clean -dfX is probably not necessary (it is a leftover from an earlier version that used the BUILD_IN_SOURCE option). If you really want to download the code instead of using a git submodule, you'll need to modify this line appropriately.
We go to the trouble to force a static-only build of the library. Adjust your configure command line if you want shared libraries.
The set_target_properties command to set the IMPORTED_LOCATION will fail without the BUILD_BYPRODUCTS ${MODBUS_STATIC_LIB} declaration.
Likewise, the set_target_properties command to set the INTERFACE_INCLUDE_DIRECTORIES will fail without the file(MAKE_DIRECTORY ${MODBUS_INCLUDES}).

Qt qmake - how to stop it adding rules to delete the target

I am trying to add a unit test to a group of other tests. All the tests are in their own subdirectories, each with its own .pro file and the .cpp file which contains the tests themselves. Running qmake in one of the subdirectories creates a Makefile, and then running make runs the compiler to build the TARGET. The tests are actually run by the 'check' target, i.e. with 'make check'.
The test I'm trying to add is different, but it is trying to pretend to behave the same way.
It is different because it is a perl script and so doesn't need to be compiled. It does, however, need to be run - so 'make check' needs to work.
I had a .pro file working for the most part - 'qmake', 'make', 'make check', and 'make clean' would work, but 'make distclean' removed my script (since it assumes it can be regenerated by compiling something).
So, the question is, how do I stop it from removing my script?
Perhaps there's some other approach I should be taking. I had tried the 'subdirs' TEMPLATE, but that does more than just remove the line in Makefile that deletes the TARGET.
Ideas?
Using Ubuntu Linux with Qt 4.6.0.
I would look into the custom target capabilities for your script. Maybe something like this:
check.commands = <scriptname>
check.depends = <any dependencies>
QMAKE_EXTRA_TARGETS += check
Doing things this way will run the check command when the dependencies change, but as long as you don't specify check.target then it shouldn't remove anything. (If your script does produce output, then perhaps that should be in check.target.) Also, since it is specified as an "extra" command, qmake shouldn't create commands to delete your script in a distclean.
This assumes that your script is in its own subdirectory (which you state), and is the only "check" command that needs to run in that subdirectory (kind of implied by the question, but not directly stated).
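A hedged sketch of how the workflow would look once the extra target is in the .pro file (the comments describe the expected behaviour, not verified output):
qmake            # regenerate the Makefile, now containing the extra "check" target
make check       # runs the perl script via check.commands
make distclean   # the script is not a generated build product, so it is left in place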
