I am trying to get a better understanding of how the renv package in R works and how it interacts with git. Here are my questions:
Assume I have master and a couple of git branches in my R project, and for each of them (master and the branches) I would like to use a different environment (different libraries, or different versions of the same libraries). Would renv be able to handle this, i.e. if I switch from one branch to another, will I need to call renv::restore()?
I have two separate projects with renv running in both of them; call them project A and project B. I would like to take the environment from project B and replace the environment in project A. How can I accomplish this? Do I just need to copy the renv folder from one project to the other?
Assume I have master and a couple of git branches in my R project, and for each of them (master and the branches) I would like to use a different environment (different libraries, or different versions of the same libraries). Would renv be able to handle this, i.e. if I switch from one branch to another, will I need to call renv::restore()?
renv excludes the project library from version control (to avoid bloating the repository), so normally you would need to call renv::restore() to rebuild the library when switching branches.
This is a bit arduous, so another solution would be to configure renv to use a different library path for each branch of your git repository. You could accomplish this with something like the following (in your project .Rprofile):
# determine the currently checked-out git branch
branch <- system("git rev-parse --abbrev-ref HEAD", intern = TRUE)
# point renv at a branch-specific library path
Sys.setenv(RENV_PATHS_LIBRARY = file.path("renv/library/branches", branch))
This way, renv would be configured to use a separate library for each branch, and the switch would happen automatically as you change branches.
This seems like something that would be useful to have in general; you might consider filing a feature request at https://github.com/rstudio/renv/issues if so.
I have two separate projects with renv running in both of them; call them project A and project B. I would like to take the environment from project B and replace the environment in project A. How can I accomplish this? Do I just need to copy the renv folder from one project to the other?
That should suffice, although it depends on how much of the environment you want to copy. The renv folder contains a settings.dcf file defining project settings; you may or may not want to copy those settings over as well. (See ?renv::settings for documentation on renv's project-specific settings.)
Alternatively, you could copy the renv.lock from project B to project A, and then call renv::restore(). This might be more appropriate if you were, e.g., copying a project from one machine to another, especially if those machines were running different operating systems.
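As a minimal sketch, assuming the two projects sit side by side and project A already has renv set up (the paths here are hypothetical):

# copy B's lockfile over A's (hypothetical paths; adjust to your layout)
cp projectB/renv.lock projectA/renv.lock
# then, from within project A, rebuild the library from the lockfile
cd projectA
Rscript -e 'renv::restore()'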
Related
I am trying to create a TeamCity build template that requires minimal customisation, and I want it to play nicely both with legacy projects and with projects developed with .NET Core/Standard and the .NET CLI.
I am stuck on NuGet, as there have been some considerable changes in how things work.
Earlier, we had to create a nuspec file to pack a project as a NuGet package. At least in that file we could define various package-related properties.
The new csproj file format allows us to define all package properties inside the project file itself. That's fine, but how then do we know which projects should be packaged and which should not?
So far, our TeamCity Pack NuGet build step just contained **.nuspec in the Specification files: field. The very presence of a nuspec file served as a flag: pack & publish this project.
However, for dotnet pack we need to specify the project. There is no simple way to distinguish 'main' projects from 'auxiliary' ones on which the main ones depend. (Let us ignore that project-to-project references are currently not supported.)
We could either pack all projects by specifying **.*proj (yet in that case we still need to know which packages to publish), or we could specify projects explicitly in a build configuration; but I don't like that approach, because you must edit the build configuration each time a new project is added to the solution.
I also considered the Generate package on build option, omitting the dotnet pack step since the package is created on build. The only thing left would then be publishing the packages with dotnet nuget push, specifying **/%BuildConfiguration%/*.nupkg.
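For reference, that option corresponds to the GeneratePackageOnBuild MSBuild property in the new csproj format, roughly like this:

<!-- in the .csproj of each project that should produce a package -->
<PropertyGroup>
  <GeneratePackageOnBuild>true</GeneratePackageOnBuild>
</PropertyGroup>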
Unfortunately, starting a build against a solution without any projects that have Generate package on build enabled makes TeamCity fail, complaining that
Target files not found for pattern "**/Release/*.nupkg"
Hence, I either need another recipe for achieving the required result, or advice on how to make TeamCity treat an empty result as a no-op and mark the build as successful.
Yet another option is to use a nuspec even for the new csproj format...
Since TeamCity 2017.2, an option to associate a build configuration with multiple templates will be available. So you will be able to create different templates to package old projects and new .NET CLI projects.
To specify the paths of the target .NET projects that should be packaged, you could use build configuration parameters.
To set such parameters during the build, you could send a service message in a preceding build step. The value of the parameter could be set to the list of target project files, which could be selected via a script like this: https://stackoverflow.com/a/8153857/305875
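As a rough sketch, a preceding command-line step could emit something like the following (the parameter name and project path are hypothetical; a later dotnet pack step would then reference %projects.to.pack%):

echo "##teamcity[setParameter name='projects.to.pack' value='src/Main/Main.csproj']"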
If you want to share a package between two projects, what's the best way to handle it? Consider two scenarios:
First Scenario
A git repository with the two projects, like:
root folder
-- Mobile App Folder
-- Web Folder
So both projects are in the same repository.
Second Scenario
Each project is in a separate git repository, and we want to share the package between those projects.
What's a good way to handle each scenario? (Either using the same method for both, or a different method for each scenario.)
You need to be aware of how Meteor handles package scanning when confronted with meteor add package:
searching for it inside the local packages/ folder of your app.
searching for it inside every folder specified in the PACKAGE_DIRS environment variable.
searching for it on Atmosphere.
I'm not sure about the exact order, but I'm assuming it's the one that makes the most sense.
So your question is basically where to store the package for an optimal workflow.
Using the first scenario, you would store your private packages inside the app root folder under packages/; you'll just have to git pull from the repo to get the latest versions of the packages. Then you would have to make sure the PACKAGE_DIRS env variable is defined correctly, something like this:
export PACKAGE_DIRS=$PACKAGE_DIRS:$HOME/meteor/my-repo/packages
Using the second scenario, you would store each private package in its own git repo, then pull them into a local folder of yours such as $HOME/meteor/packages, and don't forget to set PACKAGE_DIRS appropriately:
export PACKAGE_DIRS=$PACKAGE_DIRS:$HOME/meteor/packages
I would tend to go with the second scenario if there's a chance that these private packages may be reused in other projects; if you are sure they only make sense in one particular project, then storing them in that project's repo is OK.
Another option would be to symlink your shared private packages into the "packages" folder of each of your apps.
So assume you have your shared package in the folder /dev/mysharedpackage. You could create a symlink via ln -s /dev/mysharedpackage packages/mysharedpackage and then add the package via meteor add.
Here is a Meteor Cast on this topic: https://www.meteorcasts.net/ep/3
If you have a build chain that builds one project before building the next, but the next needs to know the path of the first build, is it possible to get this?
For example:
Project A has Build Configuration A and Build Configuration B.
Build Configuration B has a dependency on Build Configuration A. From within Build Configuration B, it will need access to the path of Build Configuration A. Is there a way to obtain this?
The simplest approach would be to define a custom checkout directory in A and use the same hard-coded value in B.
If you use TeamCity snapshot or artifact dependencies, you can use %dep.btXXX.teamcity.build.checkoutDir% to get the checkout directory of the dependency build. However, this will not work in TeamCity versions 6.5.0-6.5.5; see details and a workaround in the issue TW-18715.
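For illustration, a command-line step in B might reference it like this (btXXX stands for the actual ID of build configuration A, and the file paths are hypothetical):

# copy a build output of A out of its checkout directory (hypothetical paths)
cp "%dep.btXXX.teamcity.build.checkoutDir%/bin/output.dll" lib/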
However, you should really avoid accessing the checkout directory of one build from another. If you need the sources of A, you can check them out in B; if you need the output of A's build, then publishing the output as the build's artifacts and using TeamCity artifact dependencies is the way to go. In both cases, additionally using TeamCity snapshot dependencies will ensure both builds use the same sources snapshot, which is probably what you need.
If you have one agent, and only ever one agent, then you could try to use the path from a previous build.
I wouldn't recommend doing this, however, because if you had two agents, or scaled up to two agents in the future, it is possible your projects would be built on different agents; your dependency's working directory would then not be on the same machine, or it would be outdated because the latest build ran elsewhere.
I assume you're after the path of the first build to get its output?
If so, the method we use to share dependencies between projects is to check the output from each project into our source control; every project that requires the output then simply has to check it out.
I just started working with Ant a few days ago. Right now I have a general buildall.xml which should call each project's build.xml. Because some projects depend on each other, I need to rebuild the projects that depend on a changed one. This isn't a problem; I'm just setting the depends attribute of the target. However, Ant always builds the dependencies, even when the files haven't changed.
Let's say project1 has no dependencies; project2 depends on project1; project3 depends on projects 1 and 2; project4 depends on projects 1, 2, and 3; and so on.
I could hack together a solution which looks at project K and checks whether projects 1..K have updated files, using uptodate. If so, run the target. But this is messy and seems unnecessary.
What is the cleanest way to implement this?
EDIT: So I decided to just hack in a bunch of targets, check_projectK, where each does uptodate checks on all of project K's source files, its build file, and the build files of projects 1..K-1. Due to the dependencies, this is always handled correctly. However, this is still a large amount of copy and paste for a large workspace. I will leave this open.
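For reference, the pattern for one project looks roughly like this (target names, paths, and file types are hypothetical):

<!-- sets project2.uptodate only if project2's output is newer than its inputs -->
<target name="check_project2">
  <uptodate property="project2.uptodate" targetfile="project2/dist/project2.jar">
    <srcfiles dir="project2/src" includes="**/*"/>
    <srcfiles dir="." includes="project1/build.xml,project2/build.xml"/>
  </uptodate>
</target>

<!-- the build is skipped when the property is set -->
<target name="build_project2" depends="check_project2" unless="project2.uptodate">
  <ant dir="project2" target="build"/>
</target>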
Short answer: Ant can't do it, not unless you have some way to connect to your version control system and check whether anything has changed (you are using source control, right?). Ant doesn't track when a file last changed and compare that against what was last built; it doesn't have the concept of a dependency repository. The whole purpose of Ant is that it just builds.
The solution to your problem isn't Ant, it's Maven. Maven HAS a dependency repository. There's also a very nifty plugin for Maven used specifically with Flex, appropriately called FlexMojos. By using this, Maven can know when something was last built, because it's uploaded to the repository. Your other projects can then declare their dependencies and download the SWC they need.
On top of that, it mixes well with a continuous integration engine like Hudson, Bamboo or TeamCity, which builds a project every time a file is committed to your source control system, and then updates all dependent projects automatically!
We're in the process of streamlining/automating build, integration and unit testing as well as deployment.
Our software is developed in Visual Studio, where we use both C# and VB.NET in our projects. A single project can be contained within multiple solutions (e.g. the Utils project is used in both the ProductA and ProductB solutions).
For historical reasons our code repository isn't as well structured as one could have hoped for.
E.g. the Utils project might be located under the ProductA solution (because that's where it was first used) but was later deemed useful for ProductB development and merely included in the ProductB solution (while still located in a subdirectory of ProductA).
I would like to use continuous integration testing and have set up a CC.NET build server where I intend to use NAnt for creating the actual builds.
Question 1: How should I structure my builds on the build server? Should I instruct CC.NET to retrieve all the projects for ProductB into a single directory tree, e.g. a file structure similar to
-ProductB
--Utils
--BetterUtils
--Data
or should I opt for a file structure similar to this:
-ProductA
--Utils
-ProductB
--BetterUtils
--Data
and then just have the NAnt build scripts handle the references? The references in our VS projects don't match the actual locations in the code repository, so today it's unfortunately not possible to just check out the ProductB solution and build it straight away. I hope this question makes sense?
Question 2: Is it better to check out all the source code from the different projects into a single folder (whilst retaining some kind of structure) and then build everything at once, or to have multiple projects in CC.NET and let the CC.NET server handle the dependencies?
Example:
Should I have a separate project in CC.NET for monitoring the automated build/test of the Utils project when it's never released on its own? Or should I just build/test it whilst building it as part of ProductB?
I hope the above makes sense and that you can provide me with some arguments for using either option. We're nowhere near an ideal source code repository structure, and I would prefer to resolve the lack of repository structure on the build server instead of having to clean up the structure of our repository.
Switching away from VSS is (unfortunately) not an option.
Right now our build consists of either deploying via VS ClickOnce or pressing F5, so just getting the build automated would be a huge step up for us.
Thanks
To answer your first question, I would recommend a separate top-level folder for each build project. The problem with having a single tree matching your source repository is that when your build server tries to run multiple builds at once, one or more will likely fail because files are in use by other processes. You may also run into cases where a build script pulls an older version of the code; in that instance you don't want a different project to accidentally use the incorrect source version.
If your solutions already reference projects from relative paths, you may end up with a structure like this:
-CCNetBuilds
--ProductASource
---Utils
---...
--ProductBSource
---ProductA
----Utils
---ProductB
----BetterUtils
----Data
In this case, the build for Product B contains part of the Product A source, at the same relative path your solution already expects. This takes a bit more time to set up in CC.Net, but makes it easier to maintain if the developers have their code set up this way on their machines: the same solution files used in development are used by the build server.
To answer your second question, I prefer Utilities being its own build. If I have unit tests on my Utilities assembly, I would not want them to run for every single product that uses Utilities. Also, if you have a separate build for Utilities, you can set up a dependency in CC.Net so that Products A and B will not attempt to build if the Utilities build is broken. This gives somewhat faster feedback that something is wrong.
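One common way to express such a dependency in ccnet.config is a projectTrigger, roughly like this (the project names and interval are hypothetical):

<!-- inside the ProductA <project> element: build only after Utilities succeeds -->
<triggers>
  <projectTrigger project="Utilities">
    <triggerStatus>Success</triggerStatus>
    <innerTrigger type="intervalTrigger" seconds="60" buildCondition="ForceBuild" />
  </projectTrigger>
</triggers>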