I would like to be able to create a custom override to add a VcpkgConfiguration property based on our current configuration.
We have a C++ project that uses Premake and vcpkg. We have found vcpkg to conflict with other projects that include their own versions of similar libraries, so we cannot use the global integration that it provides. Instead we have added it as a sub-module to our project and linked it through premake with a custom override:
p.override(p.vstudio.vc2010, "importExtensionTargets", function(base, prj)
    p.push('<ImportGroup Label="ExtensionTargets">')
    p.callArray(p.vstudio.vc2010.elements.importExtensionTargets, prj)
    p.pop('</ImportGroup>')
    p.push('<ImportGroup Label="ExtensionTargets">')
    p.w('<Import Project="$(SolutionDir)External/vcpkg/scripts/buildsystems/msbuild/vcpkg.targets"/>')
    p.pop('</ImportGroup>')
end)
Unfortunately we do not use the regular "Debug" or "Release" configurations in our project, so by default vcpkg does not link correctly. To get past that problem, we modified the vcpkg.targets file in a local branch to recognize our configuration. This is not ideal, as it forces us to rebase our branch of vcpkg in order to update it, and it could conflict if that file is ever modified in their repo.
The targets file allows you to set the VcpkgConfiguration property before including the target, which is what we would like to do.
Basically what we would like is to be able to call a command through the filters like this:
filter {"configurations:<SomeConfiguration>"}
VcpkgConfig "Debug"
which would add this inside the PropertyGroup:
<VcpkgConfiguration>Debug</VcpkgConfiguration>
How can we accomplish this?
The problem seems to be that importExtensionTargets is per project but you want this per configuration.
You can try registering your own keyword:
p.api.register {
    name = "VcpkgConfig",
    scope = "config",
    kind = "string",
}
Then, in your custom function:
-- loop over all configurations defined on the project
for _, cfgName in ipairs(prj.configurations) do
    -- find the matching config object
    local cfg = p.project.findClosestMatch(prj, cfgName)
    if cfg and cfg.VcpkgConfig then
        p.push('<ImportGroup Label="ExtensionTargets">')
        -- note: MSBuild may require this property to live in a PropertyGroup instead
        p.w('<VcpkgConfiguration>' .. cfg.VcpkgConfig .. '</VcpkgConfiguration>')
        p.w('<Import Project="$(SolutionDir)External/vcpkg/scripts/buildsystems/msbuild/vcpkg.targets"/>')
        p.pop('</ImportGroup>')
    end
end
Not tested.
Would this work?
One of the DDEV sites I manage uses a database with a table prefix. The default behavior for DDEV is to recreate settings.ddev.php on every start, but that obviously overwrites anything added, purging any manual addition of the prefix.
Is the assumed solution to stop DDEV from overwriting the file? Or to create another settings file (like settings.local.php) to override what's been overridden? Or am I missing something?
This just seems like something that would exist as a simple variable in the config to generate a more accurate settings.ddev.php file. Thanks!
There are a few straightforward answers:
1. Don't let ddev fiddle with settings at all. Change the project type to 'php' and ddev won't mess with it.
2. Make the changes you want to the db settings in settings.php after the inclusion of settings.ddev.php. That should work no matter what, and it should work on your prod site as well.
3. Do the work in settings.local.php, but include it after settings.ddev.php in your settings.php file.
4. Take over settings.ddev.php and do whatever you want with it. This just means deleting the line that contains #ddev-generated in settings.ddev.php. After that, ddev won't muck with it at all.
I decided to use a version of the second suggestion:
// Automatically generated include for settings managed by ddev.
$ddev_settings = dirname(__FILE__) . '/settings.ddev.php';
if (getenv('IS_DDEV_PROJECT') == 'true' && is_readable($ddev_settings)) {
    require $ddev_settings;
    $databases['default']['default']['prefix'] = "drupal_";
}
I just added the $databases line. The rest was already there.
I'm using Flow to help author a JS project. If I want to provide a libdef file to supplement it do I need to create it manually, or am I able to execute some magic command that I'm not aware of yet which will generate the lib def for me?
Something like $ flow-typed doyourmagic would be nice.
EDIT:
Found this https://stackoverflow.com/a/38906578/192999
Which says:
There are two things:
If the file is owned by you (i.e. not a third party lib inside node_modules or such), then you can create a *.js.flow file next to it that documents its exports.
If the file is not owned by you (i.e. third party lib inside node_modules or such), then you can create a libdef file inside flow-typed/name-of-library.js
For *.js.flow files, you write the definitions like this:
// @flow
declare module.exports: { ... }
For libdef files, you write the definitions like this:
declare module "my-third-party-library" {
  declare module.exports: { ... }
}
For my question I fall into the "is owned by you" camp.
I guess I'm confused as to:
How I write these files.
How/where I publish these files to package it up for another project to reference.
Also, why do I need to create the .js.flow file manually? Can this not be magically generated? Perhaps that's the intention going forward but not implemented yet.
I found a nice guide showing how to package flow code together with the compiled code. So:
You do not have to write your own libdefs; you can use the entire Flow source code. If you want a definition with only the type declarations, you can look into flow gen-flow-files, although that is still experimental and might fail.
You can package them as *.js.flow files and the Flow checker will automatically pick those up when you import your library.
What are the strategies to embed a unique version number in a Spring application?
I've got an app using Spring Boot and Spring Web.
It's matured enough that I want to version it and see the version displayed on screen at run time.
I believe what you are looking for is generating this version number at build time (usually by build tools like Ant, Maven, or Gradle) as part of their build task chain.
I believe a quite common approach is to either put the version number into the MANIFEST.MF of the produced JAR and then read it, or to create a file that is part of the produced JAR and can be read by your application.
Another solution would be just using Spring Boot's banner customization options described here: http://docs.spring.io/spring-boot/docs/current/reference/html/boot-features-spring-application.html#boot-features-banner
However, this will only allow you to change the Spring Boot banner.
I also believe that Spring Boot exposes the product version that is set in the MANIFEST.MF of your application. To achieve this, you will need to make sure the Implementation-Version attribute of the manifest is set.
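For example, once that attribute is present, the version can be read at run time via java.lang.Package. A minimal sketch, assuming the application runs from the packaged JAR (the VersionInfo class name is illustrative; from an IDE or exploded classpath the value may be null):
public final class VersionInfo {

    private VersionInfo() {
    }

    // Reads Implementation-Version from the JAR's MANIFEST.MF.
    public static String implementationVersion() {
        String version = VersionInfo.class.getPackage().getImplementationVersion();
        return version != null ? version : "unknown";
    }
}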
Custom solution for access anywhere in the code
Let's assume you would like to have a version.properties file in your src/main/resources that contains your version information. It will contain placeholders instead of actual values, so that these placeholders can be expanded at build time.
version=${prodVersion}
build=${prodBuild}
timestamp=${buildTimestamp}
Now that you have a file like this, you need to fill it with actual data. I use Gradle, so there I would make sure that the processResources task, which runs automatically for builds, expands the resources. Something like this should do the trick in the build.gradle file for Git-based code:
import org.codehaus.groovy.runtime.*
import org.eclipse.jgit.api.*

// requires org.eclipse.jgit on the buildscript classpath
def getGitBranchCommit() {
    try {
        def git = Git.open(project.file(project.getRootProject().getProjectDir()))
        def repo = git.getRepository()
        def id = repo.resolve(repo.getFullBranch())
        return id.abbreviate(7).name()
    } catch (IOException ex) {
        return "UNKNOWN"
    }
}

processResources {
    filesMatching("**/version.properties") {
        expand(
            "prodVersion": version,
            "prodBuild": getGitBranchCommit(),
            "buildTimestamp": DateGroovyMethods.format(new Date(), 'yyyy-MM-dd HH:mm')
        )
    }
}

// make sure the file is regenerated on every build
processResources.outputs.upToDateWhen { false }
In the code above, the following is happening:
- We define a function that can take a build identifier out of the VCS (in this case Git); the commit hash is abbreviated to 7 characters.
- We configure the processResources task to process the version.properties file and fill it with our variables.
- prodVersion is taken from the Gradle project version. It's usually set as version in the gradle.properties file (part of the general build setup).
- As a last step, we ensure that the file is always regenerated (Gradle has some mechanics to detect whether files need to be processed).
Considering you are on SVN, you will need a getSvnBranchCommit() method instead. You could, for instance, use SVNKit or similar for this.
The last thing that is missing now is reading the file for use in your application.
This could be achieved by simply reading a classpath resource and parsing it into java.util.Properties. You could take it one step further and, for instance, create accessor methods for each field, e.g. getVersion(), getBuild(), etc.
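A minimal sketch of that last step, assuming the expanded version.properties ends up on the classpath (the VersionProperties class name and the accessor set are illustrative):
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public final class VersionProperties {

    private final Properties props = new Properties();

    public VersionProperties() {
        // version.properties is expected on the classpath after processResources has expanded it
        try (InputStream in = getClass().getClassLoader().getResourceAsStream("version.properties")) {
            if (in != null) {
                props.load(in);
            }
        } catch (IOException e) {
            throw new IllegalStateException("Unable to read version.properties", e);
        }
    }

    public String getVersion()   { return props.getProperty("version", "unknown"); }
    public String getBuild()     { return props.getProperty("build", "unknown"); }
    public String getTimestamp() { return props.getProperty("timestamp", "unknown"); }
}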
Hope this helps a bit (even though it may not be 100% applicable straight off).
Maven can be used to track the version number, e.g.:
<!-- pom.xml -->
<version>2.0.3</version>
Spring Boot can refer to the version, and expose it via REST using Actuator:
# application.properties
endpoints.info.enabled=true
info.app.version=@project.version@
Then use Ajax to render the version in the browser, for example using Polymer iron-ajax:
<!-- about-page.html -->
<iron-ajax auto url="/info" last-response="{{info}}"></iron-ajax>
Application version is: [[info.app.version]]
This will then show in the browser as:
Application version is: 2.0.3
I'm sure you've probably figured something out since this is an older question, but here's what I just did and it looks good. (Getting it into the banner requires you to duplicate a lot).
I'd recommend switching to git (it's a great SVN client too), and then using this in your build.gradle:
// https://github.com/n0mer/gradle-git-properties
plugins {
    id "com.gorylenko.gradle-git-properties" version "1.4.17"
}

// http://docs.spring.io/spring-boot/docs/current/reference/html/deployment-install.html
springBoot {
    buildInfo() // create META-INF/build-info.properties
}

bootRun.dependsOn = [assemble]
And this in your Spring Boot application:
@Resource
GitProperties props;

@Resource
BuildProperties props2;
Or this way to expose those properties into the standard spring environment:
@SpringBootApplication
@PropertySources({
    @PropertySource("classpath:git.properties"),
    @PropertySource("classpath:META-INF/build-info.properties")
})
public class MySpringBootApplication {
and then reference the individual properties as needed:
#Value("${git.branch}")
String gitBranch;
#Value("${build.time}")
String buildTime;
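If you then want to show the version on screen, a minimal sketch of a REST endpoint built on those beans (Spring Boot auto-configures BuildProperties and GitProperties when build-info.properties and git.properties are present); the VersionController name and the /version path are illustrative:
import java.util.LinkedHashMap;
import java.util.Map;

import org.springframework.boot.info.BuildProperties;
import org.springframework.boot.info.GitProperties;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class VersionController {

    private final BuildProperties build;
    private final GitProperties git;

    public VersionController(BuildProperties build, GitProperties git) {
        this.build = build;
        this.git = git;
    }

    @GetMapping("/version")
    public Map<String, String> version() {
        Map<String, String> info = new LinkedHashMap<>();
        info.put("version", build.getVersion());           // from build-info.properties
        info.put("buildTime", String.valueOf(build.getTime()));
        info.put("gitBranch", git.getBranch());             // from git.properties
        info.put("gitCommit", git.getShortCommitId());
        return info;
    }
}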
I have a few gradle war tasks in my build file, and I would like to change the webAppDirName per war task. I tried this:
task myWarTask(type: War) {
    ext.webAppDirName = 'src/anotherfolder/webapp' // also tried just webAppDirName
    version ""
    destinationDir = file("$buildDir/libs")
    baseName = 'myWarName'
    classpath = configurations.myWarConfiguration
}
But this is still pulling in the contents of src/main/webapp instead of src/anotherfolder/webapp
Can I configure the webAppDirName on a per war file basis like this?
There is just one webAppDirName property per project, and the War plugin automatically adds a corresponding from to each War task. So the main problem is how to undo that from. I think the following should work:
apply plugin: "war"
webAppDirName = "non/existing/dir"
task myWarTask(type: War) {
from "src/anotherfolder/webapp"
...
}
An alternative is to only use the War task type, but not the War plugin. You'll have to configure a few more task properties then, and will lose a few features, mostly related to provided configurations and publishing of the War. Of course you can make up for this with explicit configuration (if necessary). If you are interested in the details, have a look at the source code for the War plugin.
PS: webAppDirName is not an extra property (ext.), but a convention property added by the War plugin. Extra properties are only meant for ad-hoc use in build scripts. You'd use ext. when writing an extra property, but omit it when reading the property.
task myWarTask(type: War) {
    from 'src/anotherfolder/webapp'
    version ""
    destinationDir = file("$buildDir/libs")
    baseName = 'myWarName'
    classpath = configurations.myWarConfiguration
}
I'm writing an nginx module.
From looking at other examples, I'm registering my header filter in my module's postconfiguration hook:
static ngx_int_t
mod_py_postconfig(ngx_conf_t *cf)
{
    ngx_http_next_header_filter = ngx_http_top_header_filter;
    ngx_http_top_header_filter = mod_py_headers_filter;
    return NGX_OK;
}
But the handler is never called. I've set a breakpoint in gdb on the ngx_http_top_header_filter change, and it seems my module's postconfig is called first, but then the postconfig of the ngx_http_write_filter_module runs, which overrides ngx_http_top_header_filter without storing the old value:
static ngx_int_t
ngx_http_write_filter_init(ngx_conf_t *cf)
{
    ngx_http_top_body_filter = ngx_http_write_filter;
    return NGX_OK;
}
It seems like it is designed to be the very last one called, so how come my module's postconfig is called first?
From what I can see, the order of modules is set in objs/ngx_modules.c.
I was able to fix the problem by manually reordering the modules there so that my module comes after ngx_http_header_filter_module, but this feels like an ugly hack, and it also makes it hard to automate the build process, as ./configure overwrites this file each time.
OK, so I figured it out myself. Documenting it here in case anyone else needs it.
I was adding my module to the wrong list. An nginx module is configured through a 'config' file inside the module's directory. Mine had the following line in it:
HTTP_MODULES="$HTTP_MODULES ngx_http_my_module_name"
I searched for HTTP_MODULES usage and found the nginx/auto/modules script, which actually builds the ngx_modules.c file. It turns out there are several possible module lists used by nginx/auto/modules. I needed to add my module to the HTTP_AUX_FILTER_MODULES list like so:
HTTP_AUX_FILTER_MODULES="$HTTP_AUX_FILTER_MODULES ngx_http_my_module_name"
This placed my module in the right place, just after HTTP_HEADERS_FILTER_MODULE, and fixed the problem.